00:00:00.001 Started by upstream project "autotest-per-patch" build number 126102 00:00:00.001 originally caused by: 00:00:00.002 Started by user sys_sgci 00:00:00.035 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.036 The recommended git tool is: git 00:00:00.036 using credential 00000000-0000-0000-0000-000000000002 00:00:00.040 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.050 Fetching changes from the remote Git repository 00:00:00.053 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.066 Using shallow fetch with depth 1 00:00:00.066 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.066 > git --version # timeout=10 00:00:00.087 > git --version # 'git version 2.39.2' 00:00:00.087 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.114 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.114 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.955 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.964 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.973 Checking out Revision 308e970df89ed396a3f9dcf22fba8891259694e4 (FETCH_HEAD) 00:00:02.973 > git config core.sparsecheckout # timeout=10 00:00:02.983 > git read-tree -mu HEAD # timeout=10 00:00:02.997 > git checkout -f 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=5 00:00:03.014 Commit message: "jjb/create-perf-report: make job run concurrent" 00:00:03.014 > git rev-list --no-walk 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=10 00:00:03.114 [Pipeline] Start of Pipeline 00:00:03.128 [Pipeline] library 00:00:03.129 Loading library shm_lib@master 00:00:03.130 Library shm_lib@master is cached. Copying from home. 00:00:03.149 [Pipeline] node 00:00:03.159 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest 00:00:03.161 [Pipeline] { 00:00:03.171 [Pipeline] catchError 00:00:03.173 [Pipeline] { 00:00:03.186 [Pipeline] wrap 00:00:03.193 [Pipeline] { 00:00:03.200 [Pipeline] stage 00:00:03.202 [Pipeline] { (Prologue) 00:00:03.368 [Pipeline] sh 00:00:03.652 + logger -p user.info -t JENKINS-CI 00:00:03.671 [Pipeline] echo 00:00:03.672 Node: WFP50 00:00:03.680 [Pipeline] sh 00:00:03.974 [Pipeline] setCustomBuildProperty 00:00:03.988 [Pipeline] echo 00:00:03.990 Cleanup processes 00:00:03.993 [Pipeline] sh 00:00:04.272 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.273 1857838 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.285 [Pipeline] sh 00:00:04.570 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.570 ++ grep -v 'sudo pgrep' 00:00:04.570 ++ awk '{print $1}' 00:00:04.570 + sudo kill -9 00:00:04.570 + true 00:00:04.585 [Pipeline] cleanWs 00:00:04.596 [WS-CLEANUP] Deleting project workspace... 00:00:04.596 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.601 [WS-CLEANUP] done 00:00:04.606 [Pipeline] setCustomBuildProperty 00:00:04.619 [Pipeline] sh 00:00:04.895 + sudo git config --global --replace-all safe.directory '*' 00:00:04.955 [Pipeline] httpRequest 00:00:04.974 [Pipeline] echo 00:00:04.975 Sorcerer 10.211.164.101 is alive 00:00:04.981 [Pipeline] httpRequest 00:00:04.985 HttpMethod: GET 00:00:04.985 URL: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:04.986 Sending request to url: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:04.993 Response Code: HTTP/1.1 200 OK 00:00:04.993 Success: Status code 200 is in the accepted range: 200,404 00:00:04.993 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:07.217 [Pipeline] sh 00:00:07.495 + tar --no-same-owner -xf jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:07.509 [Pipeline] httpRequest 00:00:07.539 [Pipeline] echo 00:00:07.541 Sorcerer 10.211.164.101 is alive 00:00:07.548 [Pipeline] httpRequest 00:00:07.552 HttpMethod: GET 00:00:07.553 URL: http://10.211.164.101/packages/spdk_b3936a1443c9ac9c12a0d797d932e389ce7a5c85.tar.gz 00:00:07.553 Sending request to url: http://10.211.164.101/packages/spdk_b3936a1443c9ac9c12a0d797d932e389ce7a5c85.tar.gz 00:00:07.563 Response Code: HTTP/1.1 200 OK 00:00:07.564 Success: Status code 200 is in the accepted range: 200,404 00:00:07.564 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_b3936a1443c9ac9c12a0d797d932e389ce7a5c85.tar.gz 00:01:05.901 [Pipeline] sh 00:01:06.187 + tar --no-same-owner -xf spdk_b3936a1443c9ac9c12a0d797d932e389ce7a5c85.tar.gz 00:01:10.389 [Pipeline] sh 00:01:10.673 + git -C spdk log --oneline -n5 00:01:10.673 b3936a144 accel: introduce tasks in sequence limit 00:01:10.673 719d03c6a sock/uring: only register net impl if supported 00:01:10.673 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev 00:01:10.673 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO 00:01:10.673 6c7c1f57e accel: add sequence outstanding stat 00:01:10.687 [Pipeline] } 00:01:10.705 [Pipeline] // stage 00:01:10.713 [Pipeline] stage 00:01:10.715 [Pipeline] { (Prepare) 00:01:10.733 [Pipeline] writeFile 00:01:10.750 [Pipeline] sh 00:01:11.031 + logger -p user.info -t JENKINS-CI 00:01:11.045 [Pipeline] sh 00:01:11.328 + logger -p user.info -t JENKINS-CI 00:01:11.340 [Pipeline] sh 00:01:11.624 + cat autorun-spdk.conf 00:01:11.624 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:11.624 SPDK_TEST_BLOCKDEV=1 00:01:11.624 SPDK_TEST_ISAL=1 00:01:11.624 SPDK_TEST_CRYPTO=1 00:01:11.624 SPDK_TEST_REDUCE=1 00:01:11.624 SPDK_TEST_VBDEV_COMPRESS=1 00:01:11.624 SPDK_RUN_UBSAN=1 00:01:11.632 RUN_NIGHTLY=0 00:01:11.637 [Pipeline] readFile 00:01:11.669 [Pipeline] withEnv 00:01:11.672 [Pipeline] { 00:01:11.687 [Pipeline] sh 00:01:12.000 + set -ex 00:01:12.000 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:01:12.000 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:12.000 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:12.000 ++ SPDK_TEST_BLOCKDEV=1 00:01:12.000 ++ SPDK_TEST_ISAL=1 00:01:12.000 ++ SPDK_TEST_CRYPTO=1 00:01:12.000 ++ SPDK_TEST_REDUCE=1 00:01:12.000 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:12.000 ++ SPDK_RUN_UBSAN=1 00:01:12.000 ++ RUN_NIGHTLY=0 00:01:12.000 + case $SPDK_TEST_NVMF_NICS in 00:01:12.000 + DRIVERS= 00:01:12.000 + [[ -n '' ]] 00:01:12.000 + exit 0 00:01:12.009 [Pipeline] } 00:01:12.029 [Pipeline] // withEnv 
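For reference, the withEnv block traced above boils down to the following shell logic (a sketch reconstructed from the xtrace lines, not the actual jbp script; the config path and variable names are taken from the log, everything else is illustrative). This crypto job sets no SPDK_TEST_NVMF_NICS value, so DRIVERS stays empty and no extra NIC drivers are bound:

    set -ex
    conf=/var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
    # Source the per-job test configuration written by the Prepare stage.
    if [[ -f "$conf" ]]; then
        source "$conf"
    fi
    # NIC-specific NVMf jobs would populate DRIVERS here; this job matches
    # the default branch and leaves it empty.
    case "$SPDK_TEST_NVMF_NICS" in
        *) DRIVERS= ;;
    esac
    if [[ -n "$DRIVERS" ]]; then
        echo "would set up NIC drivers: $DRIVERS"   # skipped: DRIVERS is empty
    fi
    exit 0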
00:01:12.036 [Pipeline] } 00:01:12.053 [Pipeline] // stage 00:01:12.064 [Pipeline] catchError 00:01:12.066 [Pipeline] { 00:01:12.083 [Pipeline] timeout 00:01:12.083 Timeout set to expire in 40 min 00:01:12.085 [Pipeline] { 00:01:12.103 [Pipeline] stage 00:01:12.105 [Pipeline] { (Tests) 00:01:12.123 [Pipeline] sh 00:01:12.411 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:01:12.411 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:01:12.411 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:12.411 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:12.411 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:12.411 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:12.411 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:12.411 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:12.411 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:12.411 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:12.411 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:12.411 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:12.411 + source /etc/os-release 00:01:12.411 ++ NAME='Fedora Linux' 00:01:12.411 ++ VERSION='38 (Cloud Edition)' 00:01:12.411 ++ ID=fedora 00:01:12.411 ++ VERSION_ID=38 00:01:12.411 ++ VERSION_CODENAME= 00:01:12.411 ++ PLATFORM_ID=platform:f38 00:01:12.411 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:12.411 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:12.411 ++ LOGO=fedora-logo-icon 00:01:12.411 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:12.411 ++ HOME_URL=https://fedoraproject.org/ 00:01:12.411 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:12.411 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:12.411 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:12.411 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:12.411 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:12.411 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:12.411 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:12.411 ++ SUPPORT_END=2024-05-14 00:01:12.411 ++ VARIANT='Cloud Edition' 00:01:12.411 ++ VARIANT_ID=cloud 00:01:12.411 + uname -a 00:01:12.411 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:12.411 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:15.699 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:01:15.699 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:01:15.699 Hugepages 00:01:15.699 node hugesize free / total 00:01:15.699 node0 1048576kB 0 / 0 00:01:15.699 node0 2048kB 0 / 0 00:01:15.699 node1 1048576kB 0 / 0 00:01:15.699 node1 2048kB 0 / 0 00:01:15.699 00:01:15.699 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:15.699 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:15.699 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:15.699 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:15.699 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:15.699 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:15.699 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:15.699 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:15.699 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:15.699 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:01:15.699 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:15.699 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 
00:01:15.699 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:15.699 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:15.700 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:15.700 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:15.700 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:15.700 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:15.700 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:01:15.700 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:01:15.700 + rm -f /tmp/spdk-ld-path 00:01:15.700 + source autorun-spdk.conf 00:01:15.700 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:15.700 ++ SPDK_TEST_BLOCKDEV=1 00:01:15.700 ++ SPDK_TEST_ISAL=1 00:01:15.700 ++ SPDK_TEST_CRYPTO=1 00:01:15.700 ++ SPDK_TEST_REDUCE=1 00:01:15.700 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:15.700 ++ SPDK_RUN_UBSAN=1 00:01:15.700 ++ RUN_NIGHTLY=0 00:01:15.700 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:15.700 + [[ -n '' ]] 00:01:15.700 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:15.700 + for M in /var/spdk/build-*-manifest.txt 00:01:15.700 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:15.700 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:15.700 + for M in /var/spdk/build-*-manifest.txt 00:01:15.700 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:15.700 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:15.700 ++ uname 00:01:15.700 + [[ Linux == \L\i\n\u\x ]] 00:01:15.700 + sudo dmesg -T 00:01:15.700 + sudo dmesg --clear 00:01:15.700 + dmesg_pid=1859321 00:01:15.700 + [[ Fedora Linux == FreeBSD ]] 00:01:15.700 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:15.700 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:15.700 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:15.700 + [[ -x /usr/src/fio-static/fio ]] 00:01:15.700 + sudo dmesg -Tw 00:01:15.700 + export FIO_BIN=/usr/src/fio-static/fio 00:01:15.700 + FIO_BIN=/usr/src/fio-static/fio 00:01:15.700 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:15.700 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:15.700 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:15.700 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:15.700 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:15.700 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:15.700 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:15.700 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:15.700 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:15.700 Test configuration: 00:01:15.700 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:15.700 SPDK_TEST_BLOCKDEV=1 00:01:15.700 SPDK_TEST_ISAL=1 00:01:15.700 SPDK_TEST_CRYPTO=1 00:01:15.700 SPDK_TEST_REDUCE=1 00:01:15.700 SPDK_TEST_VBDEV_COMPRESS=1 00:01:15.700 SPDK_RUN_UBSAN=1 00:01:15.700 RUN_NIGHTLY=0 10:27:50 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:15.700 10:27:50 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:15.700 10:27:50 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:15.700 10:27:50 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:15.700 10:27:50 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.700 10:27:50 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.700 10:27:50 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.700 10:27:50 -- paths/export.sh@5 -- $ export PATH 00:01:15.700 10:27:50 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:15.700 10:27:50 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:15.700 10:27:50 -- common/autobuild_common.sh@444 -- $ date +%s 00:01:15.700 10:27:50 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720772870.XXXXXX 00:01:15.700 10:27:50 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720772870.vJE0m0 00:01:15.700 10:27:50 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:01:15.700 10:27:50 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:01:15.700 10:27:50 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:15.700 10:27:50 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:15.700 10:27:50 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:15.700 10:27:50 -- common/autobuild_common.sh@460 -- $ get_config_params 00:01:15.700 10:27:50 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:15.700 10:27:50 -- common/autotest_common.sh@10 -- $ set +x 00:01:15.700 10:27:50 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:15.700 10:27:50 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:01:15.700 10:27:50 -- pm/common@17 -- $ local monitor 00:01:15.700 10:27:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.700 10:27:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.700 10:27:50 -- pm/common@21 -- $ date +%s 00:01:15.700 10:27:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.700 10:27:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:15.700 10:27:50 -- pm/common@21 -- $ date +%s 00:01:15.700 10:27:50 -- pm/common@25 -- $ sleep 1 00:01:15.700 10:27:50 -- pm/common@21 -- $ date +%s 00:01:15.700 10:27:50 -- pm/common@21 -- $ date +%s 00:01:15.700 10:27:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720772870 00:01:15.700 10:27:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720772870 00:01:15.700 10:27:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720772870 00:01:15.700 10:27:50 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720772870 00:01:15.700 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720772870_collect-vmstat.pm.log 00:01:15.700 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720772870_collect-cpu-load.pm.log 00:01:15.700 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720772870_collect-cpu-temp.pm.log 00:01:15.958 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720772870_collect-bmc-pm.bmc.pm.log 00:01:16.895 10:27:51 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:01:16.895 10:27:51 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:01:16.895 10:27:51 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:16.895 10:27:51 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:16.895 10:27:51 -- spdk/autobuild.sh@16 -- $ date -u 00:01:16.895 Fri Jul 12 08:27:51 AM UTC 2024 00:01:16.895 10:27:51 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:16.895 v24.09-pre-203-gb3936a144 00:01:16.895 10:27:51 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:16.895 10:27:51 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:16.895 10:27:51 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:16.895 10:27:51 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:16.895 10:27:51 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:16.895 10:27:51 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.895 ************************************ 00:01:16.895 START TEST ubsan 00:01:16.895 ************************************ 00:01:16.895 10:27:51 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:01:16.895 using ubsan 00:01:16.895 00:01:16.895 real 0m0.001s 00:01:16.895 user 0m0.001s 00:01:16.895 sys 0m0.000s 00:01:16.895 10:27:51 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:01:16.895 10:27:51 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:16.895 ************************************ 00:01:16.895 END TEST ubsan 00:01:16.895 ************************************ 00:01:16.895 10:27:51 -- common/autotest_common.sh@1142 -- $ return 0 00:01:16.895 10:27:51 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:01:16.895 10:27:51 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:16.895 10:27:51 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:16.895 10:27:51 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:01:16.895 10:27:51 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:16.895 10:27:51 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:16.895 10:27:51 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:16.895 10:27:51 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:01:16.895 10:27:51 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:01:16.895 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:01:16.895 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:17.464 Using 'verbs' RDMA provider 00:01:33.721 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:48.626 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:48.626 Creating mk/config.mk...done. 00:01:48.626 Creating mk/cc.flags.mk...done. 00:01:48.626 Type 'make' to build. 
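To reproduce the configure/build step recorded above outside Jenkins, the same flags and make invocation can be run by hand (a sketch; in CI these calls are driven by spdk/autorun.sh and run_test, and the configure flags below are copied verbatim from the log):

    # Local SPDK checkout at the commit under test (b3936a144 per "git describe" above).
    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./configure --enable-debug --enable-werror --with-rdma --with-idxd \
        --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests \
        --with-vbdev-compress --with-dpdk-compressdev --with-crypto \
        --enable-ubsan --enable-coverage --with-ublk --with-shared
    make -j72   # the CI wraps this as: run_test make make -j72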
00:01:48.626 10:28:21 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:48.626 10:28:21 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:48.626 10:28:21 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:48.626 10:28:21 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.626 ************************************ 00:01:48.626 START TEST make 00:01:48.626 ************************************ 00:01:48.626 10:28:21 make -- common/autotest_common.sh@1123 -- $ make -j72 00:01:48.626 make[1]: Nothing to be done for 'all'. 00:02:27.400 The Meson build system 00:02:27.400 Version: 1.3.1 00:02:27.400 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:27.400 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:27.400 Build type: native build 00:02:27.400 Program cat found: YES (/usr/bin/cat) 00:02:27.400 Project name: DPDK 00:02:27.401 Project version: 24.03.0 00:02:27.401 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:27.401 C linker for the host machine: cc ld.bfd 2.39-16 00:02:27.401 Host machine cpu family: x86_64 00:02:27.401 Host machine cpu: x86_64 00:02:27.401 Message: ## Building in Developer Mode ## 00:02:27.401 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:27.401 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:27.401 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:27.401 Program python3 found: YES (/usr/bin/python3) 00:02:27.401 Program cat found: YES (/usr/bin/cat) 00:02:27.401 Compiler for C supports arguments -march=native: YES 00:02:27.401 Checking for size of "void *" : 8 00:02:27.401 Checking for size of "void *" : 8 (cached) 00:02:27.401 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:27.401 Library m found: YES 00:02:27.401 Library numa found: YES 00:02:27.401 Has header "numaif.h" : YES 00:02:27.401 Library fdt found: NO 00:02:27.401 Library execinfo found: NO 00:02:27.401 Has header "execinfo.h" : YES 00:02:27.401 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:27.401 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:27.401 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:27.401 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:27.401 Run-time dependency openssl found: YES 3.0.9 00:02:27.401 Run-time dependency libpcap found: YES 1.10.4 00:02:27.401 Has header "pcap.h" with dependency libpcap: YES 00:02:27.401 Compiler for C supports arguments -Wcast-qual: YES 00:02:27.401 Compiler for C supports arguments -Wdeprecated: YES 00:02:27.401 Compiler for C supports arguments -Wformat: YES 00:02:27.401 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:27.401 Compiler for C supports arguments -Wformat-security: NO 00:02:27.401 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:27.401 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:27.401 Compiler for C supports arguments -Wnested-externs: YES 00:02:27.401 Compiler for C supports arguments -Wold-style-definition: YES 00:02:27.401 Compiler for C supports arguments -Wpointer-arith: YES 00:02:27.401 Compiler for C supports arguments -Wsign-compare: YES 00:02:27.401 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:27.401 Compiler for C supports arguments -Wundef: YES 00:02:27.401 Compiler for C 
supports arguments -Wwrite-strings: YES 00:02:27.401 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:27.401 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:27.401 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:27.401 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:27.401 Program objdump found: YES (/usr/bin/objdump) 00:02:27.401 Compiler for C supports arguments -mavx512f: YES 00:02:27.401 Checking if "AVX512 checking" compiles: YES 00:02:27.401 Fetching value of define "__SSE4_2__" : 1 00:02:27.401 Fetching value of define "__AES__" : 1 00:02:27.401 Fetching value of define "__AVX__" : 1 00:02:27.401 Fetching value of define "__AVX2__" : 1 00:02:27.401 Fetching value of define "__AVX512BW__" : 1 00:02:27.401 Fetching value of define "__AVX512CD__" : 1 00:02:27.401 Fetching value of define "__AVX512DQ__" : 1 00:02:27.401 Fetching value of define "__AVX512F__" : 1 00:02:27.401 Fetching value of define "__AVX512VL__" : 1 00:02:27.401 Fetching value of define "__PCLMUL__" : 1 00:02:27.401 Fetching value of define "__RDRND__" : 1 00:02:27.401 Fetching value of define "__RDSEED__" : 1 00:02:27.401 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:27.401 Fetching value of define "__znver1__" : (undefined) 00:02:27.401 Fetching value of define "__znver2__" : (undefined) 00:02:27.401 Fetching value of define "__znver3__" : (undefined) 00:02:27.401 Fetching value of define "__znver4__" : (undefined) 00:02:27.401 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:27.401 Message: lib/log: Defining dependency "log" 00:02:27.401 Message: lib/kvargs: Defining dependency "kvargs" 00:02:27.401 Message: lib/telemetry: Defining dependency "telemetry" 00:02:27.401 Checking for function "getentropy" : NO 00:02:27.401 Message: lib/eal: Defining dependency "eal" 00:02:27.401 Message: lib/ring: Defining dependency "ring" 00:02:27.401 Message: lib/rcu: Defining dependency "rcu" 00:02:27.401 Message: lib/mempool: Defining dependency "mempool" 00:02:27.401 Message: lib/mbuf: Defining dependency "mbuf" 00:02:27.401 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:27.401 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:27.401 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:27.401 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:27.401 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:27.401 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:27.401 Compiler for C supports arguments -mpclmul: YES 00:02:27.401 Compiler for C supports arguments -maes: YES 00:02:27.401 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:27.401 Compiler for C supports arguments -mavx512bw: YES 00:02:27.401 Compiler for C supports arguments -mavx512dq: YES 00:02:27.401 Compiler for C supports arguments -mavx512vl: YES 00:02:27.401 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:27.401 Compiler for C supports arguments -mavx2: YES 00:02:27.401 Compiler for C supports arguments -mavx: YES 00:02:27.401 Message: lib/net: Defining dependency "net" 00:02:27.401 Message: lib/meter: Defining dependency "meter" 00:02:27.401 Message: lib/ethdev: Defining dependency "ethdev" 00:02:27.401 Message: lib/pci: Defining dependency "pci" 00:02:27.401 Message: lib/cmdline: Defining dependency "cmdline" 00:02:27.401 Message: lib/hash: Defining dependency "hash" 00:02:27.401 Message: lib/timer: Defining dependency "timer" 00:02:27.401 Message: 
lib/compressdev: Defining dependency "compressdev" 00:02:27.401 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:27.401 Message: lib/dmadev: Defining dependency "dmadev" 00:02:27.401 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:27.401 Message: lib/power: Defining dependency "power" 00:02:27.401 Message: lib/reorder: Defining dependency "reorder" 00:02:27.401 Message: lib/security: Defining dependency "security" 00:02:27.401 Has header "linux/userfaultfd.h" : YES 00:02:27.401 Has header "linux/vduse.h" : YES 00:02:27.401 Message: lib/vhost: Defining dependency "vhost" 00:02:27.401 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:27.401 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:02:27.401 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:27.401 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:27.401 Compiler for C supports arguments -std=c11: YES 00:02:27.401 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:02:27.401 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:02:27.401 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:02:27.401 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:02:27.401 Run-time dependency libmlx5 found: YES 1.24.44.0 00:02:27.401 Run-time dependency libibverbs found: YES 1.14.44.0 00:02:27.401 Library mtcr_ul found: NO 00:02:27.401 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:02:27.401 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies 
libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:28.333 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:28.333 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:28.333 Configuring mlx5_autoconf.h using configuration 00:02:28.333 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:28.333 Run-time dependency libcrypto found: YES 3.0.9 00:02:28.333 Library IPSec_MB found: YES 00:02:28.333 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:28.333 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:28.333 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:28.333 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:28.333 Library IPSec_MB found: YES 00:02:28.333 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:28.333 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:28.333 Compiler for C supports arguments -std=c11: YES (cached) 00:02:28.333 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:28.333 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:28.333 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:28.333 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:28.333 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:28.333 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:28.333 Library libisal found: NO 00:02:28.333 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:28.333 Compiler for C supports arguments -std=c11: YES (cached) 00:02:28.333 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:28.333 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:28.333 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:28.333 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:28.333 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:28.333 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:28.333 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:28.333 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:28.333 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:28.333 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:28.333 Program doxygen found: YES (/usr/bin/doxygen) 00:02:28.333 Configuring doxy-api-html.conf using configuration 00:02:28.333 Configuring doxy-api-man.conf using configuration 00:02:28.333 Program mandb found: YES (/usr/bin/mandb) 00:02:28.333 Program sphinx-build found: NO 00:02:28.333 Configuring rte_build_config.h using configuration 00:02:28.334 Message: 00:02:28.334 ================= 00:02:28.334 Applications Enabled 00:02:28.334 ================= 00:02:28.334 00:02:28.334 apps: 00:02:28.334 00:02:28.334 00:02:28.334 Message: 00:02:28.334 ================= 00:02:28.334 Libraries Enabled 00:02:28.334 ================= 00:02:28.334 00:02:28.334 libs: 00:02:28.334 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:28.334 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:28.334 cryptodev, dmadev, power, reorder, security, vhost, 00:02:28.334 00:02:28.334 Message: 00:02:28.334 =============== 00:02:28.334 Drivers Enabled 00:02:28.334 =============== 00:02:28.334 00:02:28.334 common: 00:02:28.334 mlx5, qat, 00:02:28.334 bus: 00:02:28.334 auxiliary, pci, vdev, 00:02:28.334 mempool: 00:02:28.334 ring, 00:02:28.334 dma: 00:02:28.334 00:02:28.334 net: 00:02:28.334 00:02:28.334 crypto: 00:02:28.334 ipsec_mb, mlx5, 00:02:28.334 compress: 00:02:28.334 isal, mlx5, 00:02:28.334 vdpa: 00:02:28.334 00:02:28.334 00:02:28.334 Message: 00:02:28.334 ================= 00:02:28.334 Content Skipped 00:02:28.334 ================= 00:02:28.334 00:02:28.334 apps: 00:02:28.334 dumpcap: explicitly disabled via build config 00:02:28.334 graph: explicitly disabled via build config 00:02:28.334 pdump: explicitly disabled via build config 00:02:28.334 proc-info: explicitly disabled via build config 00:02:28.334 test-acl: explicitly disabled via build config 00:02:28.334 test-bbdev: explicitly disabled via build config 00:02:28.334 test-cmdline: explicitly disabled via build config 00:02:28.334 test-compress-perf: explicitly disabled via build config 00:02:28.334 test-crypto-perf: explicitly disabled via build config 00:02:28.334 test-dma-perf: explicitly disabled via build config 00:02:28.334 test-eventdev: explicitly disabled via build config 00:02:28.334 test-fib: explicitly disabled via 
build config 00:02:28.334 test-flow-perf: explicitly disabled via build config 00:02:28.334 test-gpudev: explicitly disabled via build config 00:02:28.334 test-mldev: explicitly disabled via build config 00:02:28.334 test-pipeline: explicitly disabled via build config 00:02:28.334 test-pmd: explicitly disabled via build config 00:02:28.334 test-regex: explicitly disabled via build config 00:02:28.334 test-sad: explicitly disabled via build config 00:02:28.334 test-security-perf: explicitly disabled via build config 00:02:28.334 00:02:28.334 libs: 00:02:28.334 argparse: explicitly disabled via build config 00:02:28.334 metrics: explicitly disabled via build config 00:02:28.334 acl: explicitly disabled via build config 00:02:28.334 bbdev: explicitly disabled via build config 00:02:28.334 bitratestats: explicitly disabled via build config 00:02:28.334 bpf: explicitly disabled via build config 00:02:28.334 cfgfile: explicitly disabled via build config 00:02:28.334 distributor: explicitly disabled via build config 00:02:28.334 efd: explicitly disabled via build config 00:02:28.334 eventdev: explicitly disabled via build config 00:02:28.334 dispatcher: explicitly disabled via build config 00:02:28.334 gpudev: explicitly disabled via build config 00:02:28.334 gro: explicitly disabled via build config 00:02:28.334 gso: explicitly disabled via build config 00:02:28.334 ip_frag: explicitly disabled via build config 00:02:28.334 jobstats: explicitly disabled via build config 00:02:28.334 latencystats: explicitly disabled via build config 00:02:28.334 lpm: explicitly disabled via build config 00:02:28.334 member: explicitly disabled via build config 00:02:28.334 pcapng: explicitly disabled via build config 00:02:28.334 rawdev: explicitly disabled via build config 00:02:28.334 regexdev: explicitly disabled via build config 00:02:28.334 mldev: explicitly disabled via build config 00:02:28.334 rib: explicitly disabled via build config 00:02:28.334 sched: explicitly disabled via build config 00:02:28.334 stack: explicitly disabled via build config 00:02:28.334 ipsec: explicitly disabled via build config 00:02:28.334 pdcp: explicitly disabled via build config 00:02:28.334 fib: explicitly disabled via build config 00:02:28.334 port: explicitly disabled via build config 00:02:28.334 pdump: explicitly disabled via build config 00:02:28.334 table: explicitly disabled via build config 00:02:28.334 pipeline: explicitly disabled via build config 00:02:28.334 graph: explicitly disabled via build config 00:02:28.334 node: explicitly disabled via build config 00:02:28.334 00:02:28.334 drivers: 00:02:28.334 common/cpt: not in enabled drivers build config 00:02:28.334 common/dpaax: not in enabled drivers build config 00:02:28.334 common/iavf: not in enabled drivers build config 00:02:28.334 common/idpf: not in enabled drivers build config 00:02:28.334 common/ionic: not in enabled drivers build config 00:02:28.334 common/mvep: not in enabled drivers build config 00:02:28.334 common/octeontx: not in enabled drivers build config 00:02:28.334 bus/cdx: not in enabled drivers build config 00:02:28.334 bus/dpaa: not in enabled drivers build config 00:02:28.334 bus/fslmc: not in enabled drivers build config 00:02:28.334 bus/ifpga: not in enabled drivers build config 00:02:28.334 bus/platform: not in enabled drivers build config 00:02:28.334 bus/uacce: not in enabled drivers build config 00:02:28.334 bus/vmbus: not in enabled drivers build config 00:02:28.334 common/cnxk: not in enabled drivers build config 00:02:28.334 
common/nfp: not in enabled drivers build config 00:02:28.334 common/nitrox: not in enabled drivers build config 00:02:28.334 common/sfc_efx: not in enabled drivers build config 00:02:28.334 mempool/bucket: not in enabled drivers build config 00:02:28.334 mempool/cnxk: not in enabled drivers build config 00:02:28.334 mempool/dpaa: not in enabled drivers build config 00:02:28.334 mempool/dpaa2: not in enabled drivers build config 00:02:28.334 mempool/octeontx: not in enabled drivers build config 00:02:28.334 mempool/stack: not in enabled drivers build config 00:02:28.334 dma/cnxk: not in enabled drivers build config 00:02:28.334 dma/dpaa: not in enabled drivers build config 00:02:28.334 dma/dpaa2: not in enabled drivers build config 00:02:28.334 dma/hisilicon: not in enabled drivers build config 00:02:28.334 dma/idxd: not in enabled drivers build config 00:02:28.334 dma/ioat: not in enabled drivers build config 00:02:28.334 dma/skeleton: not in enabled drivers build config 00:02:28.334 net/af_packet: not in enabled drivers build config 00:02:28.334 net/af_xdp: not in enabled drivers build config 00:02:28.334 net/ark: not in enabled drivers build config 00:02:28.334 net/atlantic: not in enabled drivers build config 00:02:28.334 net/avp: not in enabled drivers build config 00:02:28.334 net/axgbe: not in enabled drivers build config 00:02:28.334 net/bnx2x: not in enabled drivers build config 00:02:28.334 net/bnxt: not in enabled drivers build config 00:02:28.334 net/bonding: not in enabled drivers build config 00:02:28.334 net/cnxk: not in enabled drivers build config 00:02:28.334 net/cpfl: not in enabled drivers build config 00:02:28.334 net/cxgbe: not in enabled drivers build config 00:02:28.334 net/dpaa: not in enabled drivers build config 00:02:28.334 net/dpaa2: not in enabled drivers build config 00:02:28.334 net/e1000: not in enabled drivers build config 00:02:28.334 net/ena: not in enabled drivers build config 00:02:28.334 net/enetc: not in enabled drivers build config 00:02:28.334 net/enetfec: not in enabled drivers build config 00:02:28.334 net/enic: not in enabled drivers build config 00:02:28.334 net/failsafe: not in enabled drivers build config 00:02:28.334 net/fm10k: not in enabled drivers build config 00:02:28.334 net/gve: not in enabled drivers build config 00:02:28.334 net/hinic: not in enabled drivers build config 00:02:28.334 net/hns3: not in enabled drivers build config 00:02:28.334 net/i40e: not in enabled drivers build config 00:02:28.334 net/iavf: not in enabled drivers build config 00:02:28.334 net/ice: not in enabled drivers build config 00:02:28.334 net/idpf: not in enabled drivers build config 00:02:28.334 net/igc: not in enabled drivers build config 00:02:28.334 net/ionic: not in enabled drivers build config 00:02:28.334 net/ipn3ke: not in enabled drivers build config 00:02:28.334 net/ixgbe: not in enabled drivers build config 00:02:28.334 net/mana: not in enabled drivers build config 00:02:28.334 net/memif: not in enabled drivers build config 00:02:28.334 net/mlx4: not in enabled drivers build config 00:02:28.334 net/mlx5: not in enabled drivers build config 00:02:28.334 net/mvneta: not in enabled drivers build config 00:02:28.334 net/mvpp2: not in enabled drivers build config 00:02:28.334 net/netvsc: not in enabled drivers build config 00:02:28.334 net/nfb: not in enabled drivers build config 00:02:28.334 net/nfp: not in enabled drivers build config 00:02:28.334 net/ngbe: not in enabled drivers build config 00:02:28.334 net/null: not in enabled drivers build config 
00:02:28.334 net/octeontx: not in enabled drivers build config 00:02:28.334 net/octeon_ep: not in enabled drivers build config 00:02:28.334 net/pcap: not in enabled drivers build config 00:02:28.334 net/pfe: not in enabled drivers build config 00:02:28.334 net/qede: not in enabled drivers build config 00:02:28.334 net/ring: not in enabled drivers build config 00:02:28.334 net/sfc: not in enabled drivers build config 00:02:28.334 net/softnic: not in enabled drivers build config 00:02:28.334 net/tap: not in enabled drivers build config 00:02:28.334 net/thunderx: not in enabled drivers build config 00:02:28.334 net/txgbe: not in enabled drivers build config 00:02:28.334 net/vdev_netvsc: not in enabled drivers build config 00:02:28.334 net/vhost: not in enabled drivers build config 00:02:28.334 net/virtio: not in enabled drivers build config 00:02:28.334 net/vmxnet3: not in enabled drivers build config 00:02:28.334 raw/*: missing internal dependency, "rawdev" 00:02:28.334 crypto/armv8: not in enabled drivers build config 00:02:28.334 crypto/bcmfs: not in enabled drivers build config 00:02:28.334 crypto/caam_jr: not in enabled drivers build config 00:02:28.334 crypto/ccp: not in enabled drivers build config 00:02:28.334 crypto/cnxk: not in enabled drivers build config 00:02:28.334 crypto/dpaa_sec: not in enabled drivers build config 00:02:28.334 crypto/dpaa2_sec: not in enabled drivers build config 00:02:28.334 crypto/mvsam: not in enabled drivers build config 00:02:28.334 crypto/nitrox: not in enabled drivers build config 00:02:28.334 crypto/null: not in enabled drivers build config 00:02:28.334 crypto/octeontx: not in enabled drivers build config 00:02:28.334 crypto/openssl: not in enabled drivers build config 00:02:28.334 crypto/scheduler: not in enabled drivers build config 00:02:28.334 crypto/uadk: not in enabled drivers build config 00:02:28.334 crypto/virtio: not in enabled drivers build config 00:02:28.334 compress/nitrox: not in enabled drivers build config 00:02:28.334 compress/octeontx: not in enabled drivers build config 00:02:28.334 compress/zlib: not in enabled drivers build config 00:02:28.334 regex/*: missing internal dependency, "regexdev" 00:02:28.334 ml/*: missing internal dependency, "mldev" 00:02:28.334 vdpa/ifc: not in enabled drivers build config 00:02:28.334 vdpa/mlx5: not in enabled drivers build config 00:02:28.334 vdpa/nfp: not in enabled drivers build config 00:02:28.334 vdpa/sfc: not in enabled drivers build config 00:02:28.334 event/*: missing internal dependency, "eventdev" 00:02:28.334 baseband/*: missing internal dependency, "bbdev" 00:02:28.334 gpu/*: missing internal dependency, "gpudev" 00:02:28.334 00:02:28.334 00:02:29.263 Build targets in project: 115 00:02:29.263 00:02:29.263 DPDK 24.03.0 00:02:29.263 00:02:29.263 User defined options 00:02:29.263 buildtype : debug 00:02:29.263 default_library : shared 00:02:29.263 libdir : lib 00:02:29.263 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:29.263 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:29.263 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:29.263 cpu_instruction_set: native 00:02:29.263 disable_apps : 
test-dma-perf,test,test-sad,test-acl,test-pmd,test-mldev,test-compress-perf,test-cmdline,test-regex,test-fib,graph,test-bbdev,dumpcap,test-gpudev,proc-info,test-pipeline,test-flow-perf,test-crypto-perf,pdump,test-eventdev,test-security-perf 00:02:29.263 disable_libs : port,lpm,ipsec,regexdev,dispatcher,argparse,bitratestats,rawdev,stack,graph,acl,bbdev,pipeline,member,sched,pcapng,mldev,eventdev,efd,metrics,latencystats,cfgfile,ip_frag,jobstats,pdump,pdcp,rib,node,fib,distributor,gso,table,bpf,gpudev,gro 00:02:29.263 enable_docs : false 00:02:29.263 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:29.263 enable_kmods : false 00:02:29.263 max_lcores : 128 00:02:29.263 tests : false 00:02:29.263 00:02:29.263 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:29.834 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:29.834 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:29.834 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:29.834 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:29.834 [4/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:29.834 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:29.834 [6/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:29.834 [7/378] Linking static target lib/librte_kvargs.a 00:02:29.834 [8/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:29.834 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:29.834 [10/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:29.834 [11/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:29.834 [12/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:30.097 [13/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:30.097 [14/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:30.097 [15/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:30.097 [16/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:30.097 [17/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:30.097 [18/378] Linking static target lib/librte_log.a 00:02:30.097 [19/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:30.359 [20/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:30.359 [21/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:30.359 [22/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:30.359 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:30.359 [24/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:30.359 [25/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.359 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:30.359 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:30.359 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:30.359 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:30.359 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:30.359 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:30.359 [32/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:30.359 [33/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:30.359 [34/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:30.359 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:30.359 [36/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:30.618 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:30.618 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:30.618 [39/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:30.618 [40/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:30.618 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:30.618 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:30.618 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:30.618 [44/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:30.618 [45/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:30.618 [46/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:30.618 [47/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:30.618 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:30.618 [49/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:30.618 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:30.618 [51/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:30.618 [52/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:30.618 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:30.618 [54/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:30.618 [55/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:30.618 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:30.618 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:30.618 [58/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:30.618 [59/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:30.618 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:30.618 [61/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:30.618 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:30.618 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:30.618 [64/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:30.618 [65/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:30.618 [66/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:30.618 [67/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:30.618 [68/378] Linking static target lib/librte_pci.a 00:02:30.618 [69/378] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:30.618 [70/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:30.618 [71/378] Linking static target lib/librte_ring.a 00:02:30.618 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:30.618 [73/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:30.618 [74/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:30.618 [75/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:30.618 [76/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:30.618 [77/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:30.618 [78/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:30.618 [79/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:30.618 [80/378] Linking static target lib/librte_telemetry.a 00:02:30.618 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:30.618 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:30.618 [83/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:30.619 [84/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:30.619 [85/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:30.619 [86/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:30.619 [87/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:30.619 [88/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:30.619 [89/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:30.619 [90/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:30.619 [91/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:30.619 [92/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:30.619 [93/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:30.619 [94/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:30.619 [95/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:30.619 [96/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:30.619 [97/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:30.619 [98/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:30.882 [99/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:30.882 [100/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:30.882 [101/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:30.882 [102/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:30.882 [103/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:30.882 [104/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:30.882 [105/378] Linking static target lib/librte_rcu.a 00:02:30.882 [106/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:30.882 [107/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:30.882 [108/378] Linking static target lib/librte_mempool.a 00:02:30.882 [109/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:30.882 [110/378] Compiling C object 
lib/librte_net.a.p/net_rte_arp.c.o 00:02:30.882 [111/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:30.882 [112/378] Linking static target lib/librte_net.a 00:02:30.882 [113/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.882 [114/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:30.882 [115/378] Linking target lib/librte_log.so.24.1 00:02:30.882 [116/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:30.882 [117/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:30.882 [118/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.149 [119/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:31.149 [120/378] Linking static target lib/librte_meter.a 00:02:31.149 [121/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:31.149 [122/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:31.149 [123/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:31.149 [124/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:31.149 [125/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:31.149 [126/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:31.149 [127/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:31.149 [128/378] Linking static target lib/librte_mbuf.a 00:02:31.149 [129/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:31.149 [130/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:31.149 [131/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.149 [132/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:31.149 [133/378] Linking static target lib/librte_cmdline.a 00:02:31.149 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:31.149 [135/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:31.149 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:31.149 [137/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:31.149 [138/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:31.149 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:31.149 [140/378] Linking target lib/librte_kvargs.so.24.1 00:02:31.408 [141/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:31.408 [142/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:31.408 [143/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:31.408 [144/378] Linking static target lib/librte_timer.a 00:02:31.408 [145/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:31.408 [146/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:31.408 [147/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:31.408 [148/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:31.408 [149/378] Linking static target lib/librte_eal.a 00:02:31.409 [150/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:31.409 [151/378] Compiling C object 
lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:31.409 [152/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:31.409 [153/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.409 [154/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:31.409 [155/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.409 [156/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:31.409 [157/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:31.409 [158/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:31.409 [159/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:31.409 [160/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:31.409 [161/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:31.409 [162/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:31.409 [163/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:31.409 [164/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:31.409 [165/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:31.409 [166/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:31.409 [167/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:31.409 [168/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:31.409 [169/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.409 [170/378] Linking static target lib/librte_dmadev.a 00:02:31.409 [171/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:31.409 [172/378] Linking static target lib/librte_compressdev.a 00:02:31.409 [173/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:31.409 [174/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:31.409 [175/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.409 [176/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:31.409 [177/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:31.409 [178/378] Linking target lib/librte_telemetry.so.24.1 00:02:31.409 [179/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:31.678 [180/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:31.678 [181/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:31.678 [182/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:31.678 [183/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:31.678 [184/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:31.678 [185/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:31.678 [186/378] Linking static target lib/librte_power.a 00:02:31.678 [187/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:31.678 [188/378] Linking static target lib/librte_reorder.a 00:02:31.678 [189/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:31.678 [190/378] Compiling C object 
lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:31.678 [191/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:31.678 [192/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:31.678 [193/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:31.678 [194/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:31.678 [195/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:31.678 [196/378] Linking static target lib/librte_security.a 00:02:31.678 [197/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:31.940 [198/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:31.940 [199/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:31.940 [200/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:31.940 [201/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:31.940 [202/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:31.940 [203/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:31.940 [204/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:31.941 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:31.941 [206/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:31.941 [207/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.941 [208/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:31.941 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:31.941 [210/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:31.941 [211/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:31.941 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:31.941 [213/378] Linking static target lib/librte_hash.a 00:02:31.941 [214/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:31.941 [215/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:31.941 [216/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:31.941 [217/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:31.941 [218/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:31.941 [219/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:31.941 [220/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:31.941 [221/378] Linking static target drivers/librte_bus_vdev.a 00:02:31.941 [222/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:31.941 [223/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:31.941 [224/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:31.941 [225/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:31.941 [226/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:31.941 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:31.941 [228/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.941 [229/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:31.941 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:31.941 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:32.200 [232/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:32.200 [233/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:32.200 [234/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:32.200 [235/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:32.200 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:32.200 [237/378] Linking static target drivers/librte_bus_pci.a 00:02:32.200 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:32.200 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:32.200 [240/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.200 [241/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.200 [242/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:32.200 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:32.200 [244/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:32.200 [245/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:32.200 [246/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:32.200 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:32.200 [248/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.200 [249/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:32.200 [250/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.200 [251/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:32.200 [252/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:32.200 [253/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:32.200 [254/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.200 [255/378] Linking static target lib/librte_cryptodev.a 00:02:32.200 [256/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:32.459 [257/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:32.459 [258/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 
00:02:32.459 [259/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:32.459 [260/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.459 [261/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.459 [262/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:32.459 [263/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:32.459 [264/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:32.459 [265/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:32.459 [266/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:32.459 [267/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:32.459 [268/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.459 [269/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:32.459 [270/378] Linking static target lib/librte_ethdev.a 00:02:32.459 [271/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:32.459 [272/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:32.459 [273/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:32.459 [274/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:32.459 [275/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:32.459 [276/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:32.718 [277/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.718 [278/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:32.718 [279/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:32.718 [280/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:32.718 [281/378] Linking static target drivers/librte_mempool_ring.a 00:02:32.718 [282/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:32.718 [283/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:32.718 [284/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:32.718 [285/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:32.718 [286/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:32.718 [287/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:32.718 [288/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:32.718 [289/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:32.718 [290/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:32.718 [291/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:32.718 [292/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:32.718 [293/378] Linking 
static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:32.718 [294/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:32.718 [295/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:32.976 [296/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:32.976 [297/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:32.976 [298/378] Linking static target drivers/librte_compress_mlx5.a 00:02:32.976 [299/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.976 [300/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.976 [301/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:32.976 [302/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:32.976 [303/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:32.976 [304/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:32.976 [305/378] Linking static target drivers/librte_compress_isal.a 00:02:32.976 [306/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:32.976 [307/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:32.976 [308/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:32.976 [309/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:32.976 [310/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:32.976 [311/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:32.976 [312/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:32.976 [313/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:32.976 [314/378] Linking static target drivers/librte_common_mlx5.a 00:02:32.976 [315/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:32.976 [316/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:33.234 [317/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:33.493 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:33.493 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:33.751 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:33.751 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:33.751 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:33.751 [323/378] Linking static target drivers/librte_common_qat.a 00:02:34.318 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:34.318 [325/378] Linking static target lib/librte_vhost.a 00:02:34.318 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.849 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:39.382 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command 
(wrapped by meson to capture output) 00:02:41.916 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:43.817 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:44.075 [331/378] Linking target lib/librte_eal.so.24.1 00:02:44.075 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:44.075 [333/378] Linking target lib/librte_timer.so.24.1 00:02:44.075 [334/378] Linking target lib/librte_ring.so.24.1 00:02:44.334 [335/378] Linking target lib/librte_meter.so.24.1 00:02:44.334 [336/378] Linking target lib/librte_pci.so.24.1 00:02:44.334 [337/378] Linking target lib/librte_dmadev.so.24.1 00:02:44.334 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:44.334 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:44.334 [340/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:44.334 [341/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:44.334 [342/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:44.334 [343/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:44.334 [344/378] Linking target lib/librte_rcu.so.24.1 00:02:44.334 [345/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:44.334 [346/378] Linking target lib/librte_mempool.so.24.1 00:02:44.334 [347/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:44.334 [348/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:44.334 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:44.592 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:44.592 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:44.592 [352/378] Linking target lib/librte_mbuf.so.24.1 00:02:44.592 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:44.592 [354/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:44.850 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:44.850 [356/378] Linking target lib/librte_compressdev.so.24.1 00:02:44.850 [357/378] Linking target lib/librte_reorder.so.24.1 00:02:44.850 [358/378] Linking target lib/librte_net.so.24.1 00:02:44.850 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:44.850 [360/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:44.850 [361/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:45.109 [362/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:45.109 [363/378] Linking target lib/librte_security.so.24.1 00:02:45.109 [364/378] Linking target lib/librte_hash.so.24.1 00:02:45.109 [365/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:45.109 [366/378] Linking target lib/librte_cmdline.so.24.1 00:02:45.109 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:45.109 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:45.109 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:45.368 [370/378] Linking target 
drivers/librte_common_mlx5.so.24.1 00:02:45.368 [371/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:45.368 [372/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:45.368 [373/378] Linking target lib/librte_power.so.24.1 00:02:45.626 [374/378] Linking target lib/librte_vhost.so.24.1 00:02:45.626 [375/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:45.626 [376/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:45.626 [377/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:45.626 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:02:45.626 INFO: autodetecting backend as ninja 00:02:45.626 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:47.040 CC lib/ut/ut.o 00:02:47.040 CC lib/log/log.o 00:02:47.040 CC lib/log/log_flags.o 00:02:47.040 CC lib/log/log_deprecated.o 00:02:47.040 CC lib/ut_mock/mock.o 00:02:47.040 LIB libspdk_ut.a 00:02:47.040 SO libspdk_ut.so.2.0 00:02:47.040 LIB libspdk_ut_mock.a 00:02:47.040 SYMLINK libspdk_ut.so 00:02:47.040 SO libspdk_ut_mock.so.6.0 00:02:47.299 SYMLINK libspdk_ut_mock.so 00:02:47.299 LIB libspdk_log.a 00:02:47.299 SO libspdk_log.so.7.0 00:02:47.299 SYMLINK libspdk_log.so 00:02:47.866 CC lib/util/base64.o 00:02:47.866 CC lib/util/cpuset.o 00:02:47.866 CC lib/util/bit_array.o 00:02:47.866 CC lib/util/crc32.o 00:02:47.866 CC lib/util/crc16.o 00:02:47.866 CC lib/util/crc32c.o 00:02:47.866 CC lib/util/crc32_ieee.o 00:02:47.866 CC lib/util/crc64.o 00:02:47.866 CC lib/util/dif.o 00:02:47.866 CC lib/util/file.o 00:02:47.866 CC lib/util/fd.o 00:02:47.866 CC lib/util/hexlify.o 00:02:47.866 CC lib/dma/dma.o 00:02:47.866 CC lib/util/math.o 00:02:47.866 CC lib/util/iov.o 00:02:47.866 CXX lib/trace_parser/trace.o 00:02:47.866 CC lib/util/pipe.o 00:02:47.866 CC lib/util/strerror_tls.o 00:02:47.866 CC lib/util/string.o 00:02:47.866 CC lib/util/uuid.o 00:02:47.866 CC lib/util/fd_group.o 00:02:47.866 CC lib/util/xor.o 00:02:47.866 CC lib/ioat/ioat.o 00:02:47.866 CC lib/util/zipf.o 00:02:47.866 CC lib/vfio_user/host/vfio_user_pci.o 00:02:47.866 CC lib/vfio_user/host/vfio_user.o 00:02:47.866 LIB libspdk_dma.a 00:02:47.866 SO libspdk_dma.so.4.0 00:02:48.123 SYMLINK libspdk_dma.so 00:02:48.123 LIB libspdk_ioat.a 00:02:48.123 SO libspdk_ioat.so.7.0 00:02:48.123 LIB libspdk_util.a 00:02:48.123 SYMLINK libspdk_ioat.so 00:02:48.123 LIB libspdk_vfio_user.a 00:02:48.123 SO libspdk_vfio_user.so.5.0 00:02:48.381 SO libspdk_util.so.9.1 00:02:48.381 SYMLINK libspdk_vfio_user.so 00:02:48.381 SYMLINK libspdk_util.so 00:02:48.639 LIB libspdk_trace_parser.a 00:02:48.639 SO libspdk_trace_parser.so.5.0 00:02:48.896 CC lib/json/json_parse.o 00:02:48.896 CC lib/json/json_util.o 00:02:48.896 CC lib/json/json_write.o 00:02:48.896 CC lib/rdma_utils/rdma_utils.o 00:02:48.896 CC lib/vmd/vmd.o 00:02:48.896 CC lib/vmd/led.o 00:02:48.896 CC lib/rdma_provider/common.o 00:02:48.896 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:48.896 CC lib/idxd/idxd.o 00:02:48.896 CC lib/env_dpdk/env.o 00:02:48.896 CC lib/idxd/idxd_user.o 00:02:48.896 CC lib/env_dpdk/memory.o 00:02:48.896 CC lib/idxd/idxd_kernel.o 00:02:48.896 CC lib/conf/conf.o 00:02:48.896 CC lib/env_dpdk/pci.o 00:02:48.896 CC lib/env_dpdk/init.o 00:02:48.896 CC lib/env_dpdk/threads.o 00:02:48.896 CC lib/env_dpdk/pci_ioat.o 00:02:48.896 CC lib/env_dpdk/pci_virtio.o 00:02:48.896 CC lib/env_dpdk/pci_vmd.o 
00:02:48.896 CC lib/env_dpdk/pci_idxd.o 00:02:48.896 CC lib/env_dpdk/sigbus_handler.o 00:02:48.896 CC lib/env_dpdk/pci_event.o 00:02:48.896 CC lib/env_dpdk/pci_dpdk.o 00:02:48.896 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:48.896 CC lib/reduce/reduce.o 00:02:48.896 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:48.896 SYMLINK libspdk_trace_parser.so 00:02:49.153 LIB libspdk_rdma_provider.a 00:02:49.153 SO libspdk_rdma_provider.so.6.0 00:02:49.153 LIB libspdk_conf.a 00:02:49.153 LIB libspdk_rdma_utils.a 00:02:49.153 LIB libspdk_json.a 00:02:49.153 SO libspdk_conf.so.6.0 00:02:49.153 SO libspdk_rdma_utils.so.1.0 00:02:49.153 SYMLINK libspdk_rdma_provider.so 00:02:49.153 SO libspdk_json.so.6.0 00:02:49.153 SYMLINK libspdk_conf.so 00:02:49.153 SYMLINK libspdk_rdma_utils.so 00:02:49.153 SYMLINK libspdk_json.so 00:02:49.411 LIB libspdk_reduce.a 00:02:49.411 SO libspdk_reduce.so.6.0 00:02:49.411 LIB libspdk_idxd.a 00:02:49.411 SO libspdk_idxd.so.12.0 00:02:49.411 SYMLINK libspdk_reduce.so 00:02:49.411 LIB libspdk_vmd.a 00:02:49.411 SO libspdk_vmd.so.6.0 00:02:49.411 SYMLINK libspdk_idxd.so 00:02:49.670 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:49.670 CC lib/jsonrpc/jsonrpc_server.o 00:02:49.670 CC lib/jsonrpc/jsonrpc_client.o 00:02:49.670 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:49.670 SYMLINK libspdk_vmd.so 00:02:49.928 LIB libspdk_jsonrpc.a 00:02:49.928 SO libspdk_jsonrpc.so.6.0 00:02:49.928 SYMLINK libspdk_jsonrpc.so 00:02:50.187 LIB libspdk_env_dpdk.a 00:02:50.187 SO libspdk_env_dpdk.so.14.1 00:02:50.445 CC lib/rpc/rpc.o 00:02:50.445 SYMLINK libspdk_env_dpdk.so 00:02:50.445 LIB libspdk_rpc.a 00:02:50.703 SO libspdk_rpc.so.6.0 00:02:50.703 SYMLINK libspdk_rpc.so 00:02:50.961 CC lib/notify/notify.o 00:02:50.961 CC lib/notify/notify_rpc.o 00:02:50.961 CC lib/keyring/keyring_rpc.o 00:02:50.961 CC lib/keyring/keyring.o 00:02:50.961 CC lib/trace/trace_rpc.o 00:02:50.961 CC lib/trace/trace.o 00:02:50.961 CC lib/trace/trace_flags.o 00:02:51.219 LIB libspdk_notify.a 00:02:51.219 SO libspdk_notify.so.6.0 00:02:51.219 LIB libspdk_keyring.a 00:02:51.219 LIB libspdk_trace.a 00:02:51.219 SYMLINK libspdk_notify.so 00:02:51.477 SO libspdk_keyring.so.1.0 00:02:51.477 SO libspdk_trace.so.10.0 00:02:51.477 SYMLINK libspdk_keyring.so 00:02:51.477 SYMLINK libspdk_trace.so 00:02:51.735 CC lib/sock/sock.o 00:02:51.735 CC lib/sock/sock_rpc.o 00:02:51.735 CC lib/thread/thread.o 00:02:51.735 CC lib/thread/iobuf.o 00:02:52.301 LIB libspdk_sock.a 00:02:52.301 SO libspdk_sock.so.10.0 00:02:52.301 SYMLINK libspdk_sock.so 00:02:52.560 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:52.560 CC lib/nvme/nvme_ctrlr.o 00:02:52.560 CC lib/nvme/nvme_fabric.o 00:02:52.560 CC lib/nvme/nvme_ns_cmd.o 00:02:52.560 CC lib/nvme/nvme_ns.o 00:02:52.560 CC lib/nvme/nvme_pcie_common.o 00:02:52.560 CC lib/nvme/nvme_pcie.o 00:02:52.560 CC lib/nvme/nvme_qpair.o 00:02:52.560 CC lib/nvme/nvme.o 00:02:52.560 CC lib/nvme/nvme_quirks.o 00:02:52.560 CC lib/nvme/nvme_transport.o 00:02:52.818 CC lib/nvme/nvme_discovery.o 00:02:52.818 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:52.818 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:52.818 CC lib/nvme/nvme_tcp.o 00:02:52.818 CC lib/nvme/nvme_opal.o 00:02:52.818 CC lib/nvme/nvme_io_msg.o 00:02:52.818 CC lib/nvme/nvme_poll_group.o 00:02:52.818 CC lib/nvme/nvme_stubs.o 00:02:52.818 CC lib/nvme/nvme_zns.o 00:02:52.818 CC lib/nvme/nvme_auth.o 00:02:52.818 CC lib/nvme/nvme_cuse.o 00:02:52.818 CC lib/nvme/nvme_rdma.o 00:02:53.384 LIB libspdk_thread.a 00:02:53.384 SO libspdk_thread.so.10.1 00:02:53.384 SYMLINK libspdk_thread.so 00:02:53.950 CC 
lib/virtio/virtio.o 00:02:53.950 CC lib/virtio/virtio_pci.o 00:02:53.950 CC lib/virtio/virtio_vhost_user.o 00:02:53.950 CC lib/virtio/virtio_vfio_user.o 00:02:53.950 CC lib/accel/accel.o 00:02:53.950 CC lib/accel/accel_rpc.o 00:02:53.950 CC lib/init/json_config.o 00:02:53.950 CC lib/accel/accel_sw.o 00:02:53.950 CC lib/init/subsystem_rpc.o 00:02:53.950 CC lib/init/subsystem.o 00:02:53.950 CC lib/init/rpc.o 00:02:53.950 CC lib/blob/blobstore.o 00:02:53.950 CC lib/blob/request.o 00:02:53.950 CC lib/blob/zeroes.o 00:02:53.950 CC lib/blob/blob_bs_dev.o 00:02:54.208 LIB libspdk_init.a 00:02:54.208 SO libspdk_init.so.5.0 00:02:54.208 LIB libspdk_virtio.a 00:02:54.208 SO libspdk_virtio.so.7.0 00:02:54.208 SYMLINK libspdk_init.so 00:02:54.465 SYMLINK libspdk_virtio.so 00:02:54.723 CC lib/event/app.o 00:02:54.723 CC lib/event/log_rpc.o 00:02:54.723 CC lib/event/reactor.o 00:02:54.723 CC lib/event/app_rpc.o 00:02:54.723 CC lib/event/scheduler_static.o 00:02:54.981 LIB libspdk_accel.a 00:02:54.981 LIB libspdk_nvme.a 00:02:54.981 SO libspdk_accel.so.15.1 00:02:54.981 SYMLINK libspdk_accel.so 00:02:54.981 SO libspdk_nvme.so.13.1 00:02:54.981 LIB libspdk_event.a 00:02:55.239 SO libspdk_event.so.14.0 00:02:55.239 SYMLINK libspdk_event.so 00:02:55.498 CC lib/bdev/bdev.o 00:02:55.498 CC lib/bdev/bdev_rpc.o 00:02:55.498 CC lib/bdev/part.o 00:02:55.498 CC lib/bdev/bdev_zone.o 00:02:55.498 CC lib/bdev/scsi_nvme.o 00:02:55.498 SYMLINK libspdk_nvme.so 00:02:56.875 LIB libspdk_blob.a 00:02:56.875 SO libspdk_blob.so.11.0 00:02:57.133 SYMLINK libspdk_blob.so 00:02:57.391 CC lib/lvol/lvol.o 00:02:57.391 CC lib/blobfs/blobfs.o 00:02:57.391 CC lib/blobfs/tree.o 00:02:57.958 LIB libspdk_bdev.a 00:02:57.958 LIB libspdk_blobfs.a 00:02:57.958 SO libspdk_blobfs.so.10.0 00:02:58.217 SO libspdk_bdev.so.15.1 00:02:58.217 SYMLINK libspdk_blobfs.so 00:02:58.217 SYMLINK libspdk_bdev.so 00:02:58.217 LIB libspdk_lvol.a 00:02:58.481 SO libspdk_lvol.so.10.0 00:02:58.481 SYMLINK libspdk_lvol.so 00:02:58.481 CC lib/nvmf/ctrlr.o 00:02:58.481 CC lib/nvmf/ctrlr_bdev.o 00:02:58.481 CC lib/nvmf/ctrlr_discovery.o 00:02:58.481 CC lib/nvmf/nvmf.o 00:02:58.481 CC lib/nvmf/subsystem.o 00:02:58.481 CC lib/nvmf/nvmf_rpc.o 00:02:58.481 CC lib/scsi/lun.o 00:02:58.481 CC lib/scsi/dev.o 00:02:58.481 CC lib/nvmf/transport.o 00:02:58.481 CC lib/scsi/scsi.o 00:02:58.481 CC lib/nvmf/tcp.o 00:02:58.481 CC lib/scsi/port.o 00:02:58.481 CC lib/nvmf/stubs.o 00:02:58.481 CC lib/scsi/scsi_bdev.o 00:02:58.481 CC lib/nvmf/mdns_server.o 00:02:58.481 CC lib/ftl/ftl_core.o 00:02:58.481 CC lib/scsi/scsi_pr.o 00:02:58.481 CC lib/ftl/ftl_init.o 00:02:58.481 CC lib/nvmf/rdma.o 00:02:58.481 CC lib/scsi/scsi_rpc.o 00:02:58.481 CC lib/nvmf/auth.o 00:02:58.481 CC lib/scsi/task.o 00:02:58.481 CC lib/ftl/ftl_layout.o 00:02:58.481 CC lib/ublk/ublk.o 00:02:58.481 CC lib/nbd/nbd.o 00:02:58.481 CC lib/ftl/ftl_debug.o 00:02:58.481 CC lib/nbd/nbd_rpc.o 00:02:58.481 CC lib/ftl/ftl_io.o 00:02:58.481 CC lib/ublk/ublk_rpc.o 00:02:58.481 CC lib/ftl/ftl_sb.o 00:02:58.482 CC lib/ftl/ftl_l2p.o 00:02:58.482 CC lib/ftl/ftl_l2p_flat.o 00:02:58.482 CC lib/ftl/ftl_band_ops.o 00:02:58.482 CC lib/ftl/ftl_nv_cache.o 00:02:58.482 CC lib/ftl/ftl_band.o 00:02:58.482 CC lib/ftl/ftl_writer.o 00:02:58.482 CC lib/ftl/ftl_rq.o 00:02:58.482 CC lib/ftl/ftl_l2p_cache.o 00:02:58.482 CC lib/ftl/ftl_reloc.o 00:02:58.482 CC lib/ftl/ftl_p2l.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:58.482 CC 
lib/ftl/mngt/ftl_mngt_startup.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:58.482 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:58.482 CC lib/ftl/utils/ftl_conf.o 00:02:58.482 CC lib/ftl/utils/ftl_md.o 00:02:58.482 CC lib/ftl/utils/ftl_bitmap.o 00:02:58.482 CC lib/ftl/utils/ftl_mempool.o 00:02:58.482 CC lib/ftl/utils/ftl_property.o 00:02:58.482 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:58.482 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:58.482 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:58.482 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:58.482 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:58.482 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:58.482 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:58.482 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:58.482 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:58.482 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:58.482 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:58.482 CC lib/ftl/base/ftl_base_bdev.o 00:02:58.482 CC lib/ftl/base/ftl_base_dev.o 00:02:58.482 CC lib/ftl/ftl_trace.o 00:02:59.418 LIB libspdk_scsi.a 00:02:59.418 LIB libspdk_nbd.a 00:02:59.418 SO libspdk_scsi.so.9.0 00:02:59.418 SO libspdk_nbd.so.7.0 00:02:59.418 SYMLINK libspdk_nbd.so 00:02:59.418 SYMLINK libspdk_scsi.so 00:02:59.418 LIB libspdk_ublk.a 00:02:59.418 SO libspdk_ublk.so.3.0 00:02:59.418 SYMLINK libspdk_ublk.so 00:02:59.675 LIB libspdk_ftl.a 00:02:59.675 CC lib/vhost/vhost.o 00:02:59.675 CC lib/iscsi/conn.o 00:02:59.675 CC lib/vhost/vhost_rpc.o 00:02:59.675 CC lib/vhost/vhost_scsi.o 00:02:59.675 CC lib/iscsi/init_grp.o 00:02:59.675 CC lib/vhost/vhost_blk.o 00:02:59.675 CC lib/iscsi/iscsi.o 00:02:59.675 CC lib/vhost/rte_vhost_user.o 00:02:59.675 CC lib/iscsi/portal_grp.o 00:02:59.675 CC lib/iscsi/md5.o 00:02:59.675 CC lib/iscsi/param.o 00:02:59.675 CC lib/iscsi/tgt_node.o 00:02:59.675 CC lib/iscsi/iscsi_subsystem.o 00:02:59.675 CC lib/iscsi/iscsi_rpc.o 00:02:59.675 CC lib/iscsi/task.o 00:02:59.675 SO libspdk_ftl.so.9.0 00:03:00.240 SYMLINK libspdk_ftl.so 00:03:00.807 LIB libspdk_nvmf.a 00:03:01.065 SO libspdk_nvmf.so.18.1 00:03:01.065 LIB libspdk_iscsi.a 00:03:01.324 SO libspdk_iscsi.so.8.0 00:03:01.324 SYMLINK libspdk_nvmf.so 00:03:01.324 SYMLINK libspdk_iscsi.so 00:03:01.891 LIB libspdk_vhost.a 00:03:02.150 SO libspdk_vhost.so.8.0 00:03:02.150 SYMLINK libspdk_vhost.so 00:03:02.716 CC module/env_dpdk/env_dpdk_rpc.o 00:03:02.975 CC module/accel/dsa/accel_dsa.o 00:03:02.975 CC module/accel/dsa/accel_dsa_rpc.o 00:03:02.975 CC module/accel/iaa/accel_iaa.o 00:03:02.975 CC module/accel/iaa/accel_iaa_rpc.o 00:03:02.975 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:03:02.975 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:03:02.975 CC module/accel/ioat/accel_ioat.o 00:03:02.975 CC module/accel/ioat/accel_ioat_rpc.o 00:03:02.975 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:03:02.975 CC module/accel/error/accel_error.o 00:03:02.975 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:03:02.975 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:02.975 CC module/accel/error/accel_error_rpc.o 00:03:02.975 CC module/sock/posix/posix.o 00:03:02.975 CC module/keyring/file/keyring.o 00:03:02.975 CC module/keyring/file/keyring_rpc.o 00:03:02.975 LIB 
libspdk_env_dpdk_rpc.a 00:03:02.975 CC module/scheduler/gscheduler/gscheduler.o 00:03:02.975 CC module/keyring/linux/keyring.o 00:03:02.975 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:02.975 CC module/keyring/linux/keyring_rpc.o 00:03:02.975 CC module/blob/bdev/blob_bdev.o 00:03:02.975 SO libspdk_env_dpdk_rpc.so.6.0 00:03:02.975 SYMLINK libspdk_env_dpdk_rpc.so 00:03:02.975 LIB libspdk_keyring_file.a 00:03:03.235 LIB libspdk_scheduler_gscheduler.a 00:03:03.235 LIB libspdk_keyring_linux.a 00:03:03.235 LIB libspdk_accel_ioat.a 00:03:03.235 LIB libspdk_accel_error.a 00:03:03.235 LIB libspdk_scheduler_dynamic.a 00:03:03.235 LIB libspdk_scheduler_dpdk_governor.a 00:03:03.235 SO libspdk_keyring_file.so.1.0 00:03:03.235 SO libspdk_scheduler_gscheduler.so.4.0 00:03:03.235 LIB libspdk_accel_iaa.a 00:03:03.235 SO libspdk_keyring_linux.so.1.0 00:03:03.235 LIB libspdk_accel_dsa.a 00:03:03.235 SO libspdk_accel_ioat.so.6.0 00:03:03.235 SO libspdk_accel_error.so.2.0 00:03:03.235 SO libspdk_scheduler_dynamic.so.4.0 00:03:03.235 SO libspdk_scheduler_dpdk_governor.so.4.0 00:03:03.235 SO libspdk_accel_iaa.so.3.0 00:03:03.235 SYMLINK libspdk_keyring_linux.so 00:03:03.235 SYMLINK libspdk_keyring_file.so 00:03:03.235 SYMLINK libspdk_scheduler_gscheduler.so 00:03:03.235 SYMLINK libspdk_scheduler_dpdk_governor.so 00:03:03.235 SO libspdk_accel_dsa.so.5.0 00:03:03.235 LIB libspdk_blob_bdev.a 00:03:03.235 SYMLINK libspdk_scheduler_dynamic.so 00:03:03.235 SYMLINK libspdk_accel_error.so 00:03:03.235 SYMLINK libspdk_accel_ioat.so 00:03:03.235 SYMLINK libspdk_accel_iaa.so 00:03:03.235 SO libspdk_blob_bdev.so.11.0 00:03:03.235 SYMLINK libspdk_accel_dsa.so 00:03:03.556 SYMLINK libspdk_blob_bdev.so 00:03:03.814 LIB libspdk_sock_posix.a 00:03:03.814 SO libspdk_sock_posix.so.6.0 00:03:03.814 CC module/bdev/delay/vbdev_delay.o 00:03:03.814 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:03.814 CC module/bdev/error/vbdev_error.o 00:03:03.814 CC module/bdev/error/vbdev_error_rpc.o 00:03:03.814 CC module/bdev/gpt/gpt.o 00:03:03.814 CC module/bdev/malloc/bdev_malloc.o 00:03:03.814 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:03.814 CC module/bdev/gpt/vbdev_gpt.o 00:03:03.814 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:03.814 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:03.814 CC module/bdev/aio/bdev_aio_rpc.o 00:03:03.814 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:03.814 CC module/bdev/split/vbdev_split.o 00:03:03.814 CC module/bdev/aio/bdev_aio.o 00:03:03.814 CC module/bdev/split/vbdev_split_rpc.o 00:03:03.814 CC module/bdev/lvol/vbdev_lvol.o 00:03:03.814 CC module/bdev/compress/vbdev_compress_rpc.o 00:03:03.814 CC module/bdev/raid/bdev_raid.o 00:03:03.814 CC module/bdev/compress/vbdev_compress.o 00:03:03.814 CC module/bdev/raid/bdev_raid_rpc.o 00:03:03.814 CC module/bdev/iscsi/bdev_iscsi.o 00:03:03.814 CC module/bdev/raid/raid1.o 00:03:03.814 CC module/bdev/raid/raid0.o 00:03:03.814 CC module/bdev/raid/bdev_raid_sb.o 00:03:03.814 CC module/bdev/passthru/vbdev_passthru.o 00:03:03.814 CC module/bdev/raid/concat.o 00:03:03.814 CC module/bdev/null/bdev_null.o 00:03:03.814 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:03.814 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:03.814 CC module/bdev/null/bdev_null_rpc.o 00:03:03.814 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:03.814 CC module/bdev/ftl/bdev_ftl.o 00:03:03.814 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:03.814 CC module/bdev/nvme/bdev_nvme.o 00:03:03.814 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:03.814 CC module/bdev/nvme/nvme_rpc.o 00:03:03.814 CC 
module/bdev/virtio/bdev_virtio_blk.o 00:03:03.814 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:03.814 CC module/blobfs/bdev/blobfs_bdev.o 00:03:03.814 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:03.814 CC module/bdev/nvme/bdev_mdns_client.o 00:03:03.814 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:03.814 CC module/bdev/nvme/vbdev_opal.o 00:03:03.814 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:03.814 SYMLINK libspdk_sock_posix.so 00:03:03.814 CC module/bdev/crypto/vbdev_crypto.o 00:03:03.814 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:03:04.072 LIB libspdk_accel_dpdk_compressdev.a 00:03:04.072 SO libspdk_accel_dpdk_compressdev.so.3.0 00:03:04.072 LIB libspdk_blobfs_bdev.a 00:03:04.072 SYMLINK libspdk_accel_dpdk_compressdev.so 00:03:04.072 SO libspdk_blobfs_bdev.so.6.0 00:03:04.072 LIB libspdk_bdev_error.a 00:03:04.072 LIB libspdk_bdev_aio.a 00:03:04.072 LIB libspdk_bdev_split.a 00:03:04.072 LIB libspdk_bdev_gpt.a 00:03:04.331 SO libspdk_bdev_error.so.6.0 00:03:04.331 SO libspdk_bdev_aio.so.6.0 00:03:04.331 SO libspdk_bdev_split.so.6.0 00:03:04.331 SYMLINK libspdk_blobfs_bdev.so 00:03:04.331 SO libspdk_bdev_gpt.so.6.0 00:03:04.331 LIB libspdk_bdev_zone_block.a 00:03:04.331 SYMLINK libspdk_bdev_error.so 00:03:04.331 SYMLINK libspdk_bdev_aio.so 00:03:04.331 SO libspdk_bdev_zone_block.so.6.0 00:03:04.331 LIB libspdk_bdev_compress.a 00:03:04.331 LIB libspdk_accel_dpdk_cryptodev.a 00:03:04.331 SYMLINK libspdk_bdev_split.so 00:03:04.331 LIB libspdk_bdev_iscsi.a 00:03:04.331 LIB libspdk_bdev_null.a 00:03:04.331 LIB libspdk_bdev_passthru.a 00:03:04.331 LIB libspdk_bdev_ftl.a 00:03:04.331 SYMLINK libspdk_bdev_gpt.so 00:03:04.331 LIB libspdk_bdev_virtio.a 00:03:04.331 SO libspdk_bdev_compress.so.6.0 00:03:04.331 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:03:04.331 SO libspdk_bdev_null.so.6.0 00:03:04.331 SO libspdk_bdev_passthru.so.6.0 00:03:04.331 SO libspdk_bdev_iscsi.so.6.0 00:03:04.331 SO libspdk_bdev_ftl.so.6.0 00:03:04.331 SYMLINK libspdk_bdev_zone_block.so 00:03:04.331 SO libspdk_bdev_virtio.so.6.0 00:03:04.331 LIB libspdk_bdev_delay.a 00:03:04.331 SYMLINK libspdk_bdev_compress.so 00:03:04.331 LIB libspdk_bdev_crypto.a 00:03:04.331 SYMLINK libspdk_bdev_null.so 00:03:04.331 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:03:04.331 SYMLINK libspdk_bdev_iscsi.so 00:03:04.331 LIB libspdk_bdev_malloc.a 00:03:04.331 SYMLINK libspdk_bdev_passthru.so 00:03:04.331 SYMLINK libspdk_bdev_ftl.so 00:03:04.331 SO libspdk_bdev_delay.so.6.0 00:03:04.331 SYMLINK libspdk_bdev_virtio.so 00:03:04.590 SO libspdk_bdev_crypto.so.6.0 00:03:04.590 SO libspdk_bdev_malloc.so.6.0 00:03:04.590 SYMLINK libspdk_bdev_delay.so 00:03:04.590 SYMLINK libspdk_bdev_crypto.so 00:03:04.590 SYMLINK libspdk_bdev_malloc.so 00:03:04.590 LIB libspdk_bdev_lvol.a 00:03:04.590 SO libspdk_bdev_lvol.so.6.0 00:03:04.590 SYMLINK libspdk_bdev_lvol.so 00:03:04.590 LIB libspdk_bdev_raid.a 00:03:04.848 SO libspdk_bdev_raid.so.6.0 00:03:04.848 SYMLINK libspdk_bdev_raid.so 00:03:09.036 LIB libspdk_bdev_nvme.a 00:03:09.036 SO libspdk_bdev_nvme.so.7.0 00:03:09.036 SYMLINK libspdk_bdev_nvme.so 00:03:09.602 CC module/event/subsystems/iobuf/iobuf.o 00:03:09.602 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:09.602 CC module/event/subsystems/vmd/vmd.o 00:03:09.602 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:09.602 CC module/event/subsystems/scheduler/scheduler.o 00:03:09.602 CC module/event/subsystems/keyring/keyring.o 00:03:09.602 CC module/event/subsystems/sock/sock.o 00:03:09.602 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:09.602 LIB 
libspdk_event_sock.a 00:03:09.602 LIB libspdk_event_vmd.a 00:03:09.602 LIB libspdk_event_keyring.a 00:03:09.602 LIB libspdk_event_scheduler.a 00:03:09.602 LIB libspdk_event_vhost_blk.a 00:03:09.602 SO libspdk_event_sock.so.5.0 00:03:09.859 SO libspdk_event_keyring.so.1.0 00:03:09.859 SO libspdk_event_scheduler.so.4.0 00:03:09.859 SO libspdk_event_vmd.so.6.0 00:03:09.859 SO libspdk_event_vhost_blk.so.3.0 00:03:09.859 SYMLINK libspdk_event_sock.so 00:03:09.859 SYMLINK libspdk_event_scheduler.so 00:03:09.859 SYMLINK libspdk_event_keyring.so 00:03:09.859 SYMLINK libspdk_event_vhost_blk.so 00:03:09.859 SYMLINK libspdk_event_vmd.so 00:03:09.859 LIB libspdk_event_iobuf.a 00:03:09.859 SO libspdk_event_iobuf.so.3.0 00:03:09.859 SYMLINK libspdk_event_iobuf.so 00:03:10.424 CC module/event/subsystems/accel/accel.o 00:03:10.424 LIB libspdk_event_accel.a 00:03:10.424 SO libspdk_event_accel.so.6.0 00:03:10.682 SYMLINK libspdk_event_accel.so 00:03:10.940 CC module/event/subsystems/bdev/bdev.o 00:03:11.199 LIB libspdk_event_bdev.a 00:03:11.199 SO libspdk_event_bdev.so.6.0 00:03:11.199 SYMLINK libspdk_event_bdev.so 00:03:11.458 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:11.458 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:11.458 CC module/event/subsystems/nbd/nbd.o 00:03:11.458 CC module/event/subsystems/ublk/ublk.o 00:03:11.458 CC module/event/subsystems/scsi/scsi.o 00:03:11.716 LIB libspdk_event_nbd.a 00:03:11.716 SO libspdk_event_nbd.so.6.0 00:03:11.716 LIB libspdk_event_ublk.a 00:03:11.716 LIB libspdk_event_scsi.a 00:03:11.716 LIB libspdk_event_nvmf.a 00:03:11.716 SO libspdk_event_ublk.so.3.0 00:03:11.716 SYMLINK libspdk_event_nbd.so 00:03:11.716 SO libspdk_event_scsi.so.6.0 00:03:11.716 SO libspdk_event_nvmf.so.6.0 00:03:11.975 SYMLINK libspdk_event_ublk.so 00:03:11.975 SYMLINK libspdk_event_scsi.so 00:03:11.975 SYMLINK libspdk_event_nvmf.so 00:03:12.232 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:12.232 CC module/event/subsystems/iscsi/iscsi.o 00:03:12.490 LIB libspdk_event_vhost_scsi.a 00:03:12.490 LIB libspdk_event_iscsi.a 00:03:12.490 SO libspdk_event_vhost_scsi.so.3.0 00:03:12.490 SO libspdk_event_iscsi.so.6.0 00:03:12.490 SYMLINK libspdk_event_vhost_scsi.so 00:03:12.490 SYMLINK libspdk_event_iscsi.so 00:03:12.748 SO libspdk.so.6.0 00:03:12.748 SYMLINK libspdk.so 00:03:13.005 CC test/rpc_client/rpc_client_test.o 00:03:13.005 TEST_HEADER include/spdk/accel.h 00:03:13.005 TEST_HEADER include/spdk/accel_module.h 00:03:13.005 TEST_HEADER include/spdk/assert.h 00:03:13.005 CC app/trace_record/trace_record.o 00:03:13.005 TEST_HEADER include/spdk/barrier.h 00:03:13.005 TEST_HEADER include/spdk/base64.h 00:03:13.005 TEST_HEADER include/spdk/bdev_module.h 00:03:13.006 TEST_HEADER include/spdk/bdev_zone.h 00:03:13.006 TEST_HEADER include/spdk/bdev.h 00:03:13.006 TEST_HEADER include/spdk/bit_array.h 00:03:13.006 TEST_HEADER include/spdk/bit_pool.h 00:03:13.006 TEST_HEADER include/spdk/blob_bdev.h 00:03:13.006 CC app/spdk_lspci/spdk_lspci.o 00:03:13.006 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:13.006 TEST_HEADER include/spdk/blobfs.h 00:03:13.006 TEST_HEADER include/spdk/blob.h 00:03:13.006 TEST_HEADER include/spdk/conf.h 00:03:13.006 CXX app/trace/trace.o 00:03:13.006 TEST_HEADER include/spdk/config.h 00:03:13.006 TEST_HEADER include/spdk/crc16.h 00:03:13.006 TEST_HEADER include/spdk/cpuset.h 00:03:13.006 CC app/spdk_nvme_identify/identify.o 00:03:13.006 TEST_HEADER include/spdk/crc64.h 00:03:13.006 TEST_HEADER include/spdk/crc32.h 00:03:13.006 TEST_HEADER include/spdk/dif.h 
00:03:13.006 TEST_HEADER include/spdk/endian.h 00:03:13.006 TEST_HEADER include/spdk/env_dpdk.h 00:03:13.006 TEST_HEADER include/spdk/dma.h 00:03:13.265 CC app/spdk_top/spdk_top.o 00:03:13.265 TEST_HEADER include/spdk/env.h 00:03:13.265 TEST_HEADER include/spdk/event.h 00:03:13.265 CC app/spdk_nvme_perf/perf.o 00:03:13.265 TEST_HEADER include/spdk/fd_group.h 00:03:13.265 TEST_HEADER include/spdk/fd.h 00:03:13.265 TEST_HEADER include/spdk/file.h 00:03:13.265 CC app/spdk_nvme_discover/discovery_aer.o 00:03:13.265 TEST_HEADER include/spdk/ftl.h 00:03:13.265 TEST_HEADER include/spdk/gpt_spec.h 00:03:13.265 TEST_HEADER include/spdk/hexlify.h 00:03:13.265 TEST_HEADER include/spdk/histogram_data.h 00:03:13.265 TEST_HEADER include/spdk/idxd.h 00:03:13.265 TEST_HEADER include/spdk/idxd_spec.h 00:03:13.265 TEST_HEADER include/spdk/ioat_spec.h 00:03:13.265 TEST_HEADER include/spdk/ioat.h 00:03:13.265 TEST_HEADER include/spdk/init.h 00:03:13.265 TEST_HEADER include/spdk/iscsi_spec.h 00:03:13.265 TEST_HEADER include/spdk/json.h 00:03:13.265 TEST_HEADER include/spdk/jsonrpc.h 00:03:13.265 TEST_HEADER include/spdk/keyring.h 00:03:13.265 TEST_HEADER include/spdk/keyring_module.h 00:03:13.266 TEST_HEADER include/spdk/likely.h 00:03:13.266 TEST_HEADER include/spdk/log.h 00:03:13.266 TEST_HEADER include/spdk/lvol.h 00:03:13.266 TEST_HEADER include/spdk/memory.h 00:03:13.266 TEST_HEADER include/spdk/nbd.h 00:03:13.266 TEST_HEADER include/spdk/mmio.h 00:03:13.266 TEST_HEADER include/spdk/notify.h 00:03:13.266 TEST_HEADER include/spdk/nvme.h 00:03:13.266 TEST_HEADER include/spdk/nvme_intel.h 00:03:13.266 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:13.266 TEST_HEADER include/spdk/nvme_spec.h 00:03:13.266 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:13.266 TEST_HEADER include/spdk/nvme_zns.h 00:03:13.266 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:13.266 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:13.266 TEST_HEADER include/spdk/nvmf.h 00:03:13.266 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:13.266 TEST_HEADER include/spdk/nvmf_spec.h 00:03:13.266 TEST_HEADER include/spdk/nvmf_transport.h 00:03:13.266 TEST_HEADER include/spdk/opal.h 00:03:13.266 TEST_HEADER include/spdk/pci_ids.h 00:03:13.266 CC app/spdk_dd/spdk_dd.o 00:03:13.266 TEST_HEADER include/spdk/pipe.h 00:03:13.266 TEST_HEADER include/spdk/opal_spec.h 00:03:13.266 TEST_HEADER include/spdk/queue.h 00:03:13.266 TEST_HEADER include/spdk/reduce.h 00:03:13.266 TEST_HEADER include/spdk/rpc.h 00:03:13.266 TEST_HEADER include/spdk/scheduler.h 00:03:13.266 TEST_HEADER include/spdk/scsi_spec.h 00:03:13.266 TEST_HEADER include/spdk/scsi.h 00:03:13.266 TEST_HEADER include/spdk/sock.h 00:03:13.266 TEST_HEADER include/spdk/stdinc.h 00:03:13.266 TEST_HEADER include/spdk/string.h 00:03:13.266 TEST_HEADER include/spdk/trace.h 00:03:13.266 TEST_HEADER include/spdk/thread.h 00:03:13.266 TEST_HEADER include/spdk/trace_parser.h 00:03:13.266 TEST_HEADER include/spdk/tree.h 00:03:13.266 TEST_HEADER include/spdk/ublk.h 00:03:13.266 TEST_HEADER include/spdk/util.h 00:03:13.266 TEST_HEADER include/spdk/uuid.h 00:03:13.266 TEST_HEADER include/spdk/version.h 00:03:13.266 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:13.266 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:13.266 CC app/iscsi_tgt/iscsi_tgt.o 00:03:13.266 TEST_HEADER include/spdk/vhost.h 00:03:13.266 TEST_HEADER include/spdk/vmd.h 00:03:13.266 TEST_HEADER include/spdk/xor.h 00:03:13.266 TEST_HEADER include/spdk/zipf.h 00:03:13.266 CXX test/cpp_headers/accel.o 00:03:13.266 CXX test/cpp_headers/accel_module.o 
00:03:13.266 CXX test/cpp_headers/assert.o 00:03:13.266 CC app/nvmf_tgt/nvmf_main.o 00:03:13.266 CXX test/cpp_headers/base64.o 00:03:13.266 CXX test/cpp_headers/bdev.o 00:03:13.266 CXX test/cpp_headers/barrier.o 00:03:13.266 CXX test/cpp_headers/bdev_module.o 00:03:13.266 CXX test/cpp_headers/bit_array.o 00:03:13.266 CXX test/cpp_headers/bit_pool.o 00:03:13.266 CXX test/cpp_headers/blobfs_bdev.o 00:03:13.266 CXX test/cpp_headers/blob_bdev.o 00:03:13.266 CXX test/cpp_headers/blobfs.o 00:03:13.266 CXX test/cpp_headers/bdev_zone.o 00:03:13.266 CXX test/cpp_headers/blob.o 00:03:13.266 CXX test/cpp_headers/conf.o 00:03:13.266 CXX test/cpp_headers/config.o 00:03:13.266 CXX test/cpp_headers/cpuset.o 00:03:13.266 CXX test/cpp_headers/crc16.o 00:03:13.266 CXX test/cpp_headers/crc32.o 00:03:13.266 CXX test/cpp_headers/crc64.o 00:03:13.266 CXX test/cpp_headers/dif.o 00:03:13.266 CXX test/cpp_headers/endian.o 00:03:13.266 CXX test/cpp_headers/env_dpdk.o 00:03:13.266 CXX test/cpp_headers/dma.o 00:03:13.266 CXX test/cpp_headers/env.o 00:03:13.266 CXX test/cpp_headers/event.o 00:03:13.266 CXX test/cpp_headers/fd_group.o 00:03:13.266 CXX test/cpp_headers/fd.o 00:03:13.266 CXX test/cpp_headers/file.o 00:03:13.266 CXX test/cpp_headers/gpt_spec.o 00:03:13.266 CXX test/cpp_headers/histogram_data.o 00:03:13.266 CXX test/cpp_headers/hexlify.o 00:03:13.266 CXX test/cpp_headers/idxd.o 00:03:13.266 CXX test/cpp_headers/ftl.o 00:03:13.266 CXX test/cpp_headers/idxd_spec.o 00:03:13.266 CXX test/cpp_headers/init.o 00:03:13.266 CXX test/cpp_headers/ioat.o 00:03:13.266 CXX test/cpp_headers/ioat_spec.o 00:03:13.266 CXX test/cpp_headers/iscsi_spec.o 00:03:13.266 CXX test/cpp_headers/json.o 00:03:13.266 CXX test/cpp_headers/keyring.o 00:03:13.266 CXX test/cpp_headers/jsonrpc.o 00:03:13.266 CC app/spdk_tgt/spdk_tgt.o 00:03:13.266 CC test/env/memory/memory_ut.o 00:03:13.266 CC test/app/histogram_perf/histogram_perf.o 00:03:13.266 CC test/thread/poller_perf/poller_perf.o 00:03:13.266 CC test/env/pci/pci_ut.o 00:03:13.266 CC test/app/jsoncat/jsoncat.o 00:03:13.266 CC test/env/vtophys/vtophys.o 00:03:13.266 CC app/fio/nvme/fio_plugin.o 00:03:13.266 CC test/app/stub/stub.o 00:03:13.266 CC examples/ioat/verify/verify.o 00:03:13.266 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:13.266 CC examples/ioat/perf/perf.o 00:03:13.266 CC examples/util/zipf/zipf.o 00:03:13.266 CC test/dma/test_dma/test_dma.o 00:03:13.529 LINK spdk_lspci 00:03:13.529 CC test/app/bdev_svc/bdev_svc.o 00:03:13.529 LINK rpc_client_test 00:03:13.529 CC app/fio/bdev/fio_plugin.o 00:03:13.529 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:13.529 CC test/env/mem_callbacks/mem_callbacks.o 00:03:13.529 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:13.529 LINK spdk_trace_record 00:03:13.529 LINK histogram_perf 00:03:13.529 LINK interrupt_tgt 00:03:13.806 CXX test/cpp_headers/keyring_module.o 00:03:13.806 CXX test/cpp_headers/likely.o 00:03:13.806 LINK zipf 00:03:13.806 LINK env_dpdk_post_init 00:03:13.806 LINK vtophys 00:03:13.806 CXX test/cpp_headers/log.o 00:03:13.806 CXX test/cpp_headers/lvol.o 00:03:13.806 CXX test/cpp_headers/memory.o 00:03:13.806 CXX test/cpp_headers/mmio.o 00:03:13.806 LINK jsoncat 00:03:13.806 LINK stub 00:03:13.806 LINK iscsi_tgt 00:03:13.806 CXX test/cpp_headers/nbd.o 00:03:13.806 CXX test/cpp_headers/notify.o 00:03:13.806 CXX test/cpp_headers/nvme.o 00:03:13.806 CXX test/cpp_headers/nvme_intel.o 00:03:13.806 CXX test/cpp_headers/nvme_ocssd.o 00:03:13.806 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:13.806 CXX 
test/cpp_headers/nvme_spec.o 00:03:13.806 CXX test/cpp_headers/nvme_zns.o 00:03:13.806 LINK nvmf_tgt 00:03:13.806 CXX test/cpp_headers/nvmf_cmd.o 00:03:13.806 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:13.806 LINK poller_perf 00:03:13.806 CXX test/cpp_headers/nvmf.o 00:03:13.806 CXX test/cpp_headers/nvmf_spec.o 00:03:13.806 CXX test/cpp_headers/nvmf_transport.o 00:03:13.806 CXX test/cpp_headers/opal.o 00:03:13.806 LINK spdk_nvme_discover 00:03:13.806 LINK ioat_perf 00:03:13.806 LINK verify 00:03:13.806 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:13.806 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:13.806 CXX test/cpp_headers/opal_spec.o 00:03:13.806 CXX test/cpp_headers/pci_ids.o 00:03:13.806 CXX test/cpp_headers/pipe.o 00:03:13.806 LINK spdk_tgt 00:03:13.806 CXX test/cpp_headers/queue.o 00:03:13.806 CXX test/cpp_headers/reduce.o 00:03:13.806 CXX test/cpp_headers/rpc.o 00:03:13.806 CXX test/cpp_headers/scheduler.o 00:03:13.806 CXX test/cpp_headers/scsi.o 00:03:13.806 CXX test/cpp_headers/scsi_spec.o 00:03:13.806 CXX test/cpp_headers/sock.o 00:03:13.806 CXX test/cpp_headers/stdinc.o 00:03:13.806 CXX test/cpp_headers/string.o 00:03:13.806 CXX test/cpp_headers/thread.o 00:03:13.806 CXX test/cpp_headers/trace.o 00:03:13.806 CXX test/cpp_headers/trace_parser.o 00:03:13.806 CXX test/cpp_headers/tree.o 00:03:13.806 CXX test/cpp_headers/ublk.o 00:03:14.065 CXX test/cpp_headers/util.o 00:03:14.065 LINK spdk_dd 00:03:14.065 LINK bdev_svc 00:03:14.065 CXX test/cpp_headers/uuid.o 00:03:14.065 CXX test/cpp_headers/version.o 00:03:14.065 CXX test/cpp_headers/vfio_user_pci.o 00:03:14.065 CXX test/cpp_headers/vfio_user_spec.o 00:03:14.065 CXX test/cpp_headers/vhost.o 00:03:14.065 CXX test/cpp_headers/vmd.o 00:03:14.065 CXX test/cpp_headers/xor.o 00:03:14.065 CXX test/cpp_headers/zipf.o 00:03:14.323 LINK spdk_trace 00:03:14.323 LINK nvme_fuzz 00:03:14.323 LINK pci_ut 00:03:14.323 CC examples/idxd/perf/perf.o 00:03:14.323 CC examples/sock/hello_world/hello_sock.o 00:03:14.323 CC examples/vmd/lsvmd/lsvmd.o 00:03:14.323 CC examples/vmd/led/led.o 00:03:14.323 LINK spdk_bdev 00:03:14.323 CC examples/thread/thread/thread_ex.o 00:03:14.323 CC test/event/event_perf/event_perf.o 00:03:14.323 CC test/event/reactor/reactor.o 00:03:14.323 CC test/event/reactor_perf/reactor_perf.o 00:03:14.579 CC test/event/app_repeat/app_repeat.o 00:03:14.580 LINK spdk_nvme 00:03:14.580 LINK mem_callbacks 00:03:14.580 CC test/event/scheduler/scheduler.o 00:03:14.580 LINK spdk_nvme_identify 00:03:14.580 LINK led 00:03:14.580 LINK lsvmd 00:03:14.580 LINK vhost_fuzz 00:03:14.580 LINK spdk_top 00:03:14.580 LINK reactor 00:03:14.580 LINK reactor_perf 00:03:14.580 LINK event_perf 00:03:14.580 LINK hello_sock 00:03:14.580 LINK spdk_nvme_perf 00:03:14.580 LINK app_repeat 00:03:14.838 LINK scheduler 00:03:14.838 LINK idxd_perf 00:03:14.838 LINK thread 00:03:14.838 CC app/vhost/vhost.o 00:03:14.838 LINK memory_ut 00:03:14.838 LINK test_dma 00:03:15.095 LINK vhost 00:03:15.095 CC examples/nvme/arbitration/arbitration.o 00:03:15.095 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:15.095 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:15.095 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:15.095 CC examples/nvme/hotplug/hotplug.o 00:03:15.095 CC examples/nvme/abort/abort.o 00:03:15.095 CC examples/nvme/reconnect/reconnect.o 00:03:15.095 CC examples/nvme/hello_world/hello_world.o 00:03:15.353 CC examples/accel/perf/accel_perf.o 00:03:15.353 CC examples/blob/hello_world/hello_blob.o 00:03:15.353 CC test/nvme/reset/reset.o 00:03:15.353 CC 
test/nvme/reserve/reserve.o 00:03:15.353 CC test/nvme/simple_copy/simple_copy.o 00:03:15.353 CC test/nvme/startup/startup.o 00:03:15.353 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:15.353 CC examples/blob/cli/blobcli.o 00:03:15.353 CC test/nvme/aer/aer.o 00:03:15.353 CC test/nvme/overhead/overhead.o 00:03:15.353 CC test/nvme/sgl/sgl.o 00:03:15.353 CC test/nvme/err_injection/err_injection.o 00:03:15.353 CC test/nvme/cuse/cuse.o 00:03:15.353 CC test/nvme/connect_stress/connect_stress.o 00:03:15.353 CC test/nvme/e2edp/nvme_dp.o 00:03:15.353 CC test/nvme/fdp/fdp.o 00:03:15.353 CC test/nvme/fused_ordering/fused_ordering.o 00:03:15.353 CC test/nvme/compliance/nvme_compliance.o 00:03:15.353 CC test/nvme/boot_partition/boot_partition.o 00:03:15.353 CC test/blobfs/mkfs/mkfs.o 00:03:15.353 CC test/accel/dif/dif.o 00:03:15.353 LINK pmr_persistence 00:03:15.353 LINK hello_world 00:03:15.353 LINK cmb_copy 00:03:15.353 CC test/lvol/esnap/esnap.o 00:03:15.611 LINK hotplug 00:03:15.611 LINK abort 00:03:15.611 LINK arbitration 00:03:15.611 LINK simple_copy 00:03:15.611 LINK doorbell_aers 00:03:15.611 LINK startup 00:03:15.611 LINK boot_partition 00:03:15.611 LINK connect_stress 00:03:15.611 LINK err_injection 00:03:15.611 LINK reconnect 00:03:15.611 LINK fused_ordering 00:03:15.611 LINK iscsi_fuzz 00:03:15.611 LINK mkfs 00:03:15.611 LINK aer 00:03:15.611 LINK reserve 00:03:15.611 LINK reset 00:03:15.611 LINK hello_blob 00:03:15.611 LINK sgl 00:03:15.611 LINK nvme_dp 00:03:15.611 LINK overhead 00:03:15.611 LINK fdp 00:03:15.611 LINK nvme_manage 00:03:15.611 LINK nvme_compliance 00:03:15.869 LINK accel_perf 00:03:15.869 LINK dif 00:03:15.869 LINK blobcli 00:03:16.434 CC examples/bdev/hello_world/hello_bdev.o 00:03:16.434 CC examples/bdev/bdevperf/bdevperf.o 00:03:16.434 CC test/bdev/bdevio/bdevio.o 00:03:16.692 LINK hello_bdev 00:03:16.692 LINK cuse 00:03:16.950 LINK bdevio 00:03:17.517 LINK bdevperf 00:03:18.082 CC examples/nvmf/nvmf/nvmf.o 00:03:18.341 LINK nvmf 00:03:20.870 LINK esnap 00:03:20.870 00:03:20.870 real 1m34.095s 00:03:20.870 user 17m17.925s 00:03:20.870 sys 4m14.589s 00:03:20.870 10:29:56 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:20.870 10:29:56 make -- common/autotest_common.sh@10 -- $ set +x 00:03:20.870 ************************************ 00:03:20.870 END TEST make 00:03:20.870 ************************************ 00:03:20.870 10:29:56 -- common/autotest_common.sh@1142 -- $ return 0 00:03:20.870 10:29:56 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:20.870 10:29:56 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:20.870 10:29:56 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:20.870 10:29:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:20.870 10:29:56 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:20.870 10:29:56 -- pm/common@44 -- $ pid=1859367 00:03:20.870 10:29:56 -- pm/common@50 -- $ kill -TERM 1859367 00:03:20.870 10:29:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:20.870 10:29:56 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:20.870 10:29:56 -- pm/common@44 -- $ pid=1859369 00:03:20.870 10:29:56 -- pm/common@50 -- $ kill -TERM 1859369 00:03:20.870 10:29:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:20.870 10:29:56 -- pm/common@43 -- $ [[ -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:20.870 10:29:56 -- pm/common@44 -- $ pid=1859371 00:03:20.870 10:29:56 -- pm/common@50 -- $ kill -TERM 1859371 00:03:20.870 10:29:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:20.870 10:29:56 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:20.870 10:29:56 -- pm/common@44 -- $ pid=1859394 00:03:20.870 10:29:56 -- pm/common@50 -- $ sudo -E kill -TERM 1859394 00:03:21.128 10:29:56 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:21.128 10:29:56 -- nvmf/common.sh@7 -- # uname -s 00:03:21.128 10:29:56 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:21.128 10:29:56 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:21.128 10:29:56 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:21.128 10:29:56 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:21.128 10:29:56 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:21.128 10:29:56 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:21.128 10:29:56 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:21.128 10:29:56 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:21.128 10:29:56 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:21.128 10:29:56 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:21.128 10:29:56 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:03:21.128 10:29:56 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:03:21.128 10:29:56 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:21.128 10:29:56 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:21.128 10:29:56 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:21.128 10:29:56 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:21.128 10:29:56 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:21.128 10:29:56 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:21.128 10:29:56 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:21.128 10:29:56 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:21.128 10:29:56 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:21.128 10:29:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:21.128 10:29:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:21.128 10:29:56 -- paths/export.sh@5 -- # export PATH 00:03:21.128 10:29:56 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:21.128 10:29:56 -- nvmf/common.sh@47 -- # : 0 00:03:21.128 10:29:56 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:21.128 10:29:56 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:21.128 10:29:56 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:21.128 10:29:56 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:21.128 10:29:56 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:21.128 10:29:56 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:21.128 10:29:56 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:21.128 10:29:56 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:21.128 10:29:56 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:21.128 10:29:56 -- spdk/autotest.sh@32 -- # uname -s 00:03:21.128 10:29:56 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:21.128 10:29:56 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:21.128 10:29:56 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:21.128 10:29:56 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:21.128 10:29:56 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:21.128 10:29:56 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:21.128 10:29:56 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:21.128 10:29:56 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:21.128 10:29:56 -- spdk/autotest.sh@48 -- # udevadm_pid=1926201 00:03:21.128 10:29:56 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:21.128 10:29:56 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:21.128 10:29:56 -- pm/common@17 -- # local monitor 00:03:21.128 10:29:56 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:21.128 10:29:56 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:21.128 10:29:56 -- pm/common@21 -- # date +%s 00:03:21.128 10:29:56 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:21.128 10:29:56 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:21.128 10:29:56 -- pm/common@21 -- # date +%s 00:03:21.128 10:29:56 -- pm/common@25 -- # sleep 1 00:03:21.128 10:29:56 -- pm/common@21 -- # date +%s 00:03:21.128 10:29:56 -- pm/common@21 -- # date +%s 00:03:21.128 10:29:56 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720772996 00:03:21.128 10:29:56 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720772996 00:03:21.128 10:29:56 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720772996 00:03:21.128 10:29:56 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720772996 00:03:21.128 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720772996_collect-vmstat.pm.log 00:03:21.128 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720772996_collect-cpu-load.pm.log 00:03:21.128 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720772996_collect-cpu-temp.pm.log 00:03:21.128 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720772996_collect-bmc-pm.bmc.pm.log 00:03:22.088 10:29:57 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:22.088 10:29:57 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:22.088 10:29:57 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:22.088 10:29:57 -- common/autotest_common.sh@10 -- # set +x 00:03:22.088 10:29:57 -- spdk/autotest.sh@59 -- # create_test_list 00:03:22.088 10:29:57 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:22.088 10:29:57 -- common/autotest_common.sh@10 -- # set +x 00:03:22.344 10:29:57 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:22.344 10:29:57 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:22.344 10:29:57 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:22.344 10:29:57 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:22.344 10:29:57 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:22.344 10:29:57 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:22.344 10:29:57 -- common/autotest_common.sh@1455 -- # uname 00:03:22.344 10:29:57 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:22.344 10:29:57 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:22.344 10:29:57 -- common/autotest_common.sh@1475 -- # uname 00:03:22.344 10:29:57 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:22.344 10:29:57 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:22.345 10:29:57 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:22.345 10:29:57 -- spdk/autotest.sh@72 -- # hash lcov 00:03:22.345 10:29:57 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:22.345 10:29:57 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:22.345 --rc lcov_branch_coverage=1 00:03:22.345 --rc lcov_function_coverage=1 00:03:22.345 --rc genhtml_branch_coverage=1 00:03:22.345 --rc genhtml_function_coverage=1 00:03:22.345 --rc genhtml_legend=1 00:03:22.345 --rc geninfo_all_blocks=1 00:03:22.345 ' 00:03:22.345 10:29:57 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:22.345 --rc lcov_branch_coverage=1 00:03:22.345 --rc lcov_function_coverage=1 00:03:22.345 --rc genhtml_branch_coverage=1 00:03:22.345 --rc genhtml_function_coverage=1 00:03:22.345 --rc genhtml_legend=1 00:03:22.345 --rc geninfo_all_blocks=1 00:03:22.345 ' 00:03:22.345 10:29:57 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:22.345 --rc lcov_branch_coverage=1 00:03:22.345 --rc lcov_function_coverage=1 00:03:22.345 --rc genhtml_branch_coverage=1 00:03:22.345 --rc genhtml_function_coverage=1 00:03:22.345 --rc genhtml_legend=1 00:03:22.345 --rc geninfo_all_blocks=1 00:03:22.345 --no-external' 00:03:22.345 10:29:57 -- spdk/autotest.sh@81 -- # 
LCOV='lcov 00:03:22.345 --rc lcov_branch_coverage=1 00:03:22.345 --rc lcov_function_coverage=1 00:03:22.345 --rc genhtml_branch_coverage=1 00:03:22.345 --rc genhtml_function_coverage=1 00:03:22.345 --rc genhtml_legend=1 00:03:22.345 --rc geninfo_all_blocks=1 00:03:22.345 --no-external' 00:03:22.345 10:29:57 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:22.345 lcov: LCOV version 1.14 00:03:22.345 10:29:57 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:27.651 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:27.651 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:27.651 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:27.651 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:27.651 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:27.652 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:27.652 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:27.652 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:27.652 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:27.652 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:27.912 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:27.912 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:27.912 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:28.173 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:28.173 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:50.085 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:50.085 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:56.638 10:30:31 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:56.638 10:30:31 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:56.638 10:30:31 -- common/autotest_common.sh@10 -- # set +x 00:03:56.638 10:30:31 -- spdk/autotest.sh@91 -- # rm -f 00:03:56.638 10:30:31 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:59.919 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:59.919 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:59.919 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:03:59.919 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:00.177 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:00:04.0 (8086 2021): Already using the ioatdma 
driver 00:04:00.178 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:00.178 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:00.436 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:00.436 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:00.436 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:00.436 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:00.436 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:00.436 10:30:35 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:04:00.436 10:30:35 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:00.436 10:30:35 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:00.436 10:30:35 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:00.436 10:30:35 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:00.436 10:30:35 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:00.436 10:30:35 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:00.436 10:30:35 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:00.436 10:30:35 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:00.436 10:30:35 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:04:00.436 10:30:35 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:04:00.436 10:30:35 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:04:00.436 10:30:35 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:04:00.436 10:30:35 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:04:00.436 10:30:35 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:00.436 No valid GPT data, bailing 00:04:00.436 10:30:35 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:00.436 10:30:35 -- scripts/common.sh@391 -- # pt= 00:04:00.436 10:30:35 -- scripts/common.sh@392 -- # return 1 00:04:00.436 10:30:35 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:00.436 1+0 records in 00:04:00.436 1+0 records out 00:04:00.436 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00230339 s, 455 MB/s 00:04:00.436 10:30:35 -- spdk/autotest.sh@118 -- # sync 00:04:00.436 10:30:35 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:00.436 10:30:35 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:00.436 10:30:35 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:05.702 10:30:40 -- spdk/autotest.sh@124 -- # uname -s 00:04:05.702 10:30:40 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:04:05.702 10:30:40 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:05.702 10:30:40 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:05.702 10:30:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:05.702 10:30:40 -- common/autotest_common.sh@10 -- # set +x 00:04:05.702 ************************************ 00:04:05.702 START TEST setup.sh 00:04:05.702 ************************************ 00:04:05.702 10:30:40 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:05.702 * Looking for test storage... 
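The trace above checks /dev/nvme0n1 for an existing partition table before zeroing its first megabyte. A minimal, hypothetical sketch of that guard, assuming only blkid: the real helper is block_in_use() in scripts/common.sh, which first consults scripts/spdk-gpt.py for GPT data and only then falls back to blkid, so this is an illustration rather than the SPDK implementation.

has_partition_table() {            # hypothetical name, not the SPDK helper
  local dev=$1
  # `blkid -s PTTYPE -o value` prints nothing when the device carries no partition table
  [[ -n "$(blkid -s PTTYPE -o value "$dev")" ]]
}

dev=/dev/nvme0n1
if ! has_partition_table "$dev"; then
  # matches the wipe the trace shows: a single 1 MiB write of zeroes over the device header
  dd if=/dev/zero of="$dev" bs=1M count=1
fi

In this run spdk-gpt.py found no GPT ("No valid GPT data, bailing") and blkid reported an empty PTTYPE, so the dd wipe proceeded and reported 1+0 records copied.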
00:04:05.702 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:05.702 10:30:40 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:05.702 10:30:40 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:05.702 10:30:40 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:05.702 10:30:40 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:05.702 10:30:40 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:05.702 10:30:40 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:05.702 ************************************ 00:04:05.702 START TEST acl 00:04:05.702 ************************************ 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:05.702 * Looking for test storage... 00:04:05.702 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:05.702 10:30:40 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:05.702 10:30:40 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:05.702 10:30:40 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:05.702 10:30:40 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:05.702 10:30:40 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:05.702 10:30:40 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:05.702 10:30:40 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:05.702 10:30:40 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:05.702 10:30:40 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:09.883 10:30:44 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:09.883 10:30:44 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:09.883 10:30:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:09.883 10:30:44 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:09.883 10:30:44 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:09.883 10:30:44 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:13.219 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:13.219 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.219 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.219 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:13.219 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 Hugepages 00:04:13.220 node hugesize free / total 
00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 00:04:13.220 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # 
continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 
10:30:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:13.220 10:30:48 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:13.220 10:30:48 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:13.220 10:30:48 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:13.220 10:30:48 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:13.479 ************************************ 00:04:13.479 START TEST denied 00:04:13.479 ************************************ 00:04:13.479 10:30:48 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:04:13.479 10:30:48 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:13.479 10:30:48 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:13.479 10:30:48 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:13.479 10:30:48 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.479 10:30:48 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:17.671 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:17.671 10:30:52 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:21.866 00:04:21.866 real 0m8.311s 00:04:21.866 user 0m2.602s 00:04:21.866 sys 0m4.939s 00:04:21.866 10:30:56 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:21.866 10:30:56 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:21.866 ************************************ 00:04:21.866 END TEST denied 00:04:21.866 ************************************ 00:04:21.866 10:30:56 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:21.866 10:30:56 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:21.866 10:30:56 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:21.866 10:30:56 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:21.866 10:30:56 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:21.866 ************************************ 00:04:21.866 START TEST allowed 00:04:21.866 ************************************ 00:04:21.866 10:30:56 setup.sh.acl.allowed -- 
common/autotest_common.sh@1123 -- # allowed 00:04:21.866 10:30:56 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:21.866 10:30:56 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:21.866 10:30:56 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:21.866 10:30:56 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.866 10:30:56 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:28.437 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:28.437 10:31:03 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:28.437 10:31:03 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:28.437 10:31:03 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:28.437 10:31:03 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:28.437 10:31:03 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:31.721 00:04:31.721 real 0m10.019s 00:04:31.721 user 0m2.593s 00:04:31.721 sys 0m4.908s 00:04:31.721 10:31:06 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:31.721 10:31:06 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:31.721 ************************************ 00:04:31.721 END TEST allowed 00:04:31.721 ************************************ 00:04:31.721 10:31:06 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:31.721 00:04:31.721 real 0m26.526s 00:04:31.721 user 0m8.240s 00:04:31.721 sys 0m15.310s 00:04:31.721 10:31:06 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:31.721 10:31:06 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:31.721 ************************************ 00:04:31.721 END TEST acl 00:04:31.721 ************************************ 00:04:31.721 10:31:06 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:31.721 10:31:06 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:31.721 10:31:06 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:31.721 10:31:06 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.721 10:31:06 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:31.981 ************************************ 00:04:31.981 START TEST hugepages 00:04:31.981 ************************************ 00:04:31.981 10:31:06 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:31.981 * Looking for test storage... 
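The xtrace above shows setup/acl.sh walking the "setup.sh status" table (Type, BDF, Vendor, Device, NUMA, Driver, ...), keeping only BDF-shaped entries whose driver is nvme, and then exercising PCI_BLOCKED (the "denied" test, which greps for "Skipping denied controller at 0000:5e:00.0") and PCI_ALLOWED (the "allowed" test, which greps for the "nvme -> ..." rebind line for 0000:5e:00.0). The hugepages trace that follows reuses the same read-loop idiom against /proc/meminfo. As a rough guide to those loops, here is a minimal bash sketch; it mirrors variable names visible in the trace (dev, driver, devs, PCI_BLOCKED, get_meminfo) but is a simplified reconstruction, not the shipped SPDK scripts, and the setup.sh path is illustrative.

    # Sketch of the acl.sh device filter traced above; simplified, not the real script.
    # setup.sh status prints: Type BDF Vendor Device NUMA Driver Device Block-devices
    devs=()
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue            # skips the header row and the hugepage-size rows
        [[ $driver == nvme ]] || continue            # ioatdma / vfio-pci entries fall through here
        [[ ${PCI_BLOCKED:-} == *"$dev"* ]] && continue
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <(/path/to/spdk/scripts/setup.sh status)  # path is illustrative

    # Sketch of the get_meminfo helper traced in the hugepages test below; the real
    # helper also handles per-node meminfo files and caches the file with mapfile.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    get_meminfo Hugepagesize   # prints 2048 on this node, matching the trace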
00:04:31.981 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 76740908 kB' 'MemAvailable: 80039888 kB' 'Buffers: 12176 kB' 'Cached: 9470236 kB' 'SwapCached: 0 kB' 'Active: 6539936 kB' 'Inactive: 3457048 kB' 'Active(anon): 6146152 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517956 kB' 'Mapped: 182208 kB' 'Shmem: 5631580 kB' 'KReclaimable: 203132 kB' 'Slab: 531420 kB' 'SReclaimable: 203132 kB' 'SUnreclaim: 328288 kB' 'KernelStack: 16160 kB' 'PageTables: 8000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438200 kB' 'Committed_AS: 7514964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200808 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.981 10:31:07 setup.sh.hugepages 
-- setup/common.sh@31 -- # IFS=': ' 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.981 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 
-- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.982 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.983 10:31:07 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:31.983 10:31:07 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:31.983 10:31:07 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:31.983 10:31:07 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.983 10:31:07 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:31.983 ************************************ 00:04:31.983 START TEST default_setup 00:04:31.983 ************************************ 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.983 10:31:07 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:35.262 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:35.262 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:35.262 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 
0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:35.262 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:37.794 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78885536 kB' 'MemAvailable: 82184468 kB' 'Buffers: 12176 kB' 'Cached: 9470356 kB' 'SwapCached: 0 kB' 'Active: 6556100 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162316 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533904 kB' 'Mapped: 182368 kB' 'Shmem: 5631700 kB' 'KReclaimable: 203036 kB' 'Slab: 530056 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327020 kB' 'KernelStack: 16320 kB' 'PageTables: 8228 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7534748 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.794 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:37.795 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.796 10:31:12 
setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78886212 kB' 'MemAvailable: 82185144 kB' 'Buffers: 12176 kB' 'Cached: 9470360 kB' 'SwapCached: 0 kB' 'Active: 6556296 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162512 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534112 kB' 'Mapped: 182352 kB' 'Shmem: 5631704 kB' 'KReclaimable: 203036 kB' 'Slab: 530040 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327004 kB' 'KernelStack: 16368 kB' 'PageTables: 8892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7537116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201112 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:37.796 10:31:12 setup.sh.hugepages.default_setup -- 
[xtrace condensed: setup/common.sh@31-32 reads each /proc/meminfo key in turn and hits 'continue' on every non-match until HugePages_Surp is found]
00:04:37.797 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:37.797 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:04:37.797 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:37.797 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0
00:04:37.797 10:31:12 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:37.797 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
[xtrace condensed: setup/common.sh@18-31 repeats the same locals, mem_f=/proc/meminfo, mapfile -t mem and Node-prefix strip as in the previous get_meminfo call]
00:04:37.797 10:31:12 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78885332 kB' 'MemAvailable: 82184264 kB' 'Buffers: 12176 kB' 'Cached: 9470376 kB' 'SwapCached: 0 kB' 'Active: 6556356 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162572 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534136 kB' 'Mapped: 182352 kB' 'Shmem: 5631720 kB' 'KReclaimable: 203036 kB' 'Slab: 530032 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 326996 kB' 'KernelStack: 16240 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7534788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
[xtrace condensed: setup/common.sh@31-32 reads each /proc/meminfo key in turn and hits 'continue' on every non-match until HugePages_Rsvd is found]
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:38.057 nr_hugepages=1024
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:38.057 resv_hugepages=0
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:38.057 surplus_hugepages=0
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:38.057 anon_hugepages=0
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total
[xtrace condensed: setup/common.sh@18-31 repeats the same locals, mem_f=/proc/meminfo, mapfile -t mem and Node-prefix strip as in the previous get_meminfo calls]
00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78886204 kB' 'MemAvailable: 82185136 kB' 'Buffers: 12176 kB' 'Cached: 9470400 kB' 'SwapCached: 0 kB' 'Active: 6556084 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162300 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533924 kB' 'Mapped: 182344 kB' 'Shmem: 5631744 kB' 'KReclaimable: 203036 kB' 'Slab: 530096 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327060 kB' 'KernelStack: 16240 kB' 'PageTables: 8264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7534564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
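The @102-@110 entries above are the accounting step of default_setup: the configured page count must match what the kernel reports once reserved and surplus pages are counted. The sketch below condenses that logic using the get_meminfo sketch shown earlier; verify_hugepage_accounting is an illustrative, made-up name, and only the arithmetic and the echoed labels come from the trace, not the actual setup/hugepages.sh code.

  # Sketch of the hugepage accounting check, assuming the get_meminfo
  # reconstruction above is already defined in the same shell.
  verify_hugepage_accounting() {
      local nr_hugepages=$1
      local anon surp resv total
      anon=$(get_meminfo AnonHugePages)    # transparent hugepage usage, expected 0 kB
      surp=$(get_meminfo HugePages_Surp)
      resv=$(get_meminfo HugePages_Rsvd)
      total=$(get_meminfo HugePages_Total)
      echo "nr_hugepages=$nr_hugepages"
      echo "resv_hugepages=$resv"
      echo "surplus_hugepages=$surp"
      echo "anon_hugepages=$anon"
      # The pool is consistent when the kernel's total equals the requested
      # count plus any surplus and reserved pages.
      (( total == nr_hugepages + surp + resv ))
  }

  verify_hugepage_accounting 1024   # passes on this node: 1024 == 1024 + 0 + 0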
setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.057 10:31:13 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue
[xtrace elided: the @31 IFS=': ' / read -r var val _ and @32 compare / continue pattern repeats for each remaining non-matching /proc/meminfo field, Zswap through Unaccepted, until HugePages_Total is reached]
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node
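What the trace above is doing: setup/common.sh's get_meminfo walks the requested meminfo file one "key: value" pair at a time (IFS=': ' plus read -r var val _), skips every key that does not match the requested field, and echoes the value of the first match back to hugepages.sh -- here HugePages_Total, value 1024, which @110 then checks against nr_hugepages + surp + resv. A minimal stand-alone sketch of that lookup pattern follows; the function name get_meminfo_sketch and its argument handling are illustrative, not the exact setup/common.sh code.

# Minimal sketch of the lookup pattern traced above (illustrative only, not the
# exact setup/common.sh implementation). Prints the value of one meminfo field,
# optionally from a specific NUMA node's meminfo file.
get_meminfo_sketch() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix every line with "Node <n> "; strip it so both
    # layouts parse identically, then split each line on ':' and whitespace.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
    return 1
}
# Example: get_meminfo_sketch HugePages_Total   -> 1024 on this machine
#          get_meminfo_sketch HugePages_Surp 0  -> surplus hugepages on node 0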
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:38.058 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 36665368 kB' 'MemUsed: 11404516 kB' 'SwapCached: 0 kB' 'Active: 5272100 kB' 'Inactive: 3294260 kB' 'Active(anon): 5016840 kB' 'Inactive(anon): 0 kB' 'Active(file): 255260 kB' 'Inactive(file): 3294260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8278132 kB' 'Mapped: 118280 kB' 'AnonPages: 291384 kB' 'Shmem: 4728612 kB' 'KernelStack: 9080 kB' 'PageTables: 4580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98392 kB' 'Slab: 283328 kB' 'SReclaimable: 98392 kB' 'SUnreclaim: 184936 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
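For the per-node part, get_nodes (hugepages.sh@29-@33 above) walks /sys/devices/system/node/node<N> and records the hugepage count it finds for each node (node0=1024, node1=0, no_nodes=2 on this box), and the dump just printed comes from node0's own meminfo file. A rough way to collect the same per-node view by hand is sketched below; reading the counts from the per-node sysfs hugepage counters is an assumption of this sketch, since the trace only shows the already-expanded values 1024 and 0.

# Rough equivalent of the per-node bookkeeping seen above (assumption: the
# counts are taken from the per-node sysfs 2 MiB hugepage counters).
nodes_sys=()
for node_dir in /sys/devices/system/node/node[0-9]*; do
    n=${node_dir##*node}                         # ".../node0" -> "0"
    nodes_sys[$n]=$(cat "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
done
echo "no_nodes=${#nodes_sys[@]}"                 # 2 on this system
for n in "${!nodes_sys[@]}"; do echo "node$n=${nodes_sys[$n]}"; done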
[xtrace elided: the same per-field walk over the node0 meminfo pairs printed above runs for HugePages_Surp; every field from MemTotal through HugePages_Free is skipped with @32 continue]
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:38.059 node0=1024 expecting 1024
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:38.059 
00:04:38.059 real 0m5.927s
00:04:38.059 user 0m1.189s
00:04:38.059 sys 0m2.145s
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:38.059 10:31:13 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x
00:04:38.059 ************************************
00:04:38.059 END TEST default_setup
00:04:38.059 ************************************
00:04:38.059 10:31:13 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:38.059 10:31:13 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:38.059 10:31:13 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:38.059 10:31:13 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:38.059 10:31:13 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:38.059 ************************************
00:04:38.059 START TEST per_node_1G_alloc
00:04:38.059 ************************************
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=,
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
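get_test_nr_hugepages above converts the requested size into a page count: 1048576 kB (1 GiB) divided by the 2048 kB default hugepage size (the 'Hugepagesize: 2048 kB' visible in the meminfo dumps) gives nr_hugepages=512, and since two node IDs were passed (0 and 1) each node is assigned 512 pages, 1024 in total. The same arithmetic, spelled out with illustrative variable names:

# The size-to-pages arithmetic behind "get_test_nr_hugepages 1048576 0 1"
# (variable names are illustrative; the numbers are the ones the trace shows).
size_kb=1048576                                     # 1 GiB requested, in kB
default_hugepage_kb=2048                            # Hugepagesize from /proc/meminfo
nr_hugepages=$(( size_kb / default_hugepage_kb ))   # 512
node_ids=(0 1)
nodes_test=()
for id in "${node_ids[@]}"; do
    nodes_test[$id]=$nr_hugepages                   # 512 pages on node 0 and node 1
done
echo "nr_hugepages=$nr_hugepages per node, $(( nr_hugepages * ${#node_ids[@]} )) total"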
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:38.059 10:31:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:41.336 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:41.336 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:41.598 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:41.598 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:41.598 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024
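With NRHUGE=512 and HUGENODE=0,1 exported, 'setup output' runs scripts/setup.sh, which performs the actual reservation of 512 hugepages on each of the two nodes; the PCI lines above are setup.sh reporting that the VMD controllers are skipped and that the remaining devices are already bound to vfio-pci. Outside the CI job the same step can be reproduced roughly as follows (the checkout path is a placeholder, and whether HUGENODE accepts the comma-separated form depends on the SPDK revision -- it is simply what this trace sets):

# Hand-run equivalent of the reservation step above. NRHUGE and HUGENODE are
# exactly the variables the trace sets; /path/to/spdk is a placeholder.
cd /path/to/spdk
sudo env NRHUGE=512 HUGENODE=0,1 ./scripts/setup.sh
# Confirm the per-node result afterwards (plain kernel sysfs, nothing SPDK-specific):
cat /sys/devices/system/node/node*/hugepages/hugepages-2048kB/nr_hugepages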
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.598 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78878808 kB' 'MemAvailable: 82177740 kB' 'Buffers: 12176 kB' 'Cached: 9470492 kB' 'SwapCached: 0 kB' 'Active: 6554432 kB' 'Inactive: 3457048 kB' 'Active(anon): 6160648 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531540 kB' 'Mapped: 181368 kB' 'Shmem: 5631836 kB' 'KReclaimable: 203036 kB' 'Slab: 530020 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 326984 kB' 'KernelStack: 16128 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7521600 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
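verify_nr_hugepages starts by reading AnonHugePages (relevant because transparent hugepages are in madvise mode here, which is what the '[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]' test above checks) and then the HugePages_* counters; in the dump just printed, HugePages_Total and HugePages_Free are both 1024 with Hugepagesize 2048 kB, i.e. the 2 x 512 pages just reserved. The same values can be pulled from a live system directly:

# Quick manual view of the fields verify_nr_hugepages is about to walk through
# (plain kernel interfaces, nothing SPDK-specific assumed).
grep -E '^(AnonHugePages|HugePages_(Total|Free|Rsvd|Surp)|Hugepagesize|Hugetlb)' /proc/meminfo
cat /sys/kernel/mm/transparent_hugepage/enabled    # e.g. "always [madvise] never"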
[xtrace elided: get_meminfo walks the /proc/meminfo pairs printed above for AnonHugePages; every field from MemTotal through HardwareCorrupted is skipped with @32 continue]
00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
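After AnonHugePages resolves to 0 (anon=0 above), the same walk is repeated system-wide for HugePages_Surp; together with the reserved-page count these feed the consistency check already seen at hugepages.sh@110, (( HugePages_Total == nr_hugepages + surp + resv )). A compact way to evaluate that identity by hand, reusing the illustrative get_meminfo_sketch helper defined earlier (treating the reserved term as HugePages_Rsvd is an assumption of this sketch):

# Re-checking the @110-style identity by hand (assumption: the reserved term
# corresponds to HugePages_Rsvd; get_meminfo_sketch is the helper sketched above).
expected=1024                                   # NRHUGE=512 on each of 2 nodes
total=$(get_meminfo_sketch HugePages_Total)
surp=$(get_meminfo_sketch HugePages_Surp)
resv=$(get_meminfo_sketch HugePages_Rsvd)
if (( total == expected + surp + resv )); then
    echo "hugepage accounting consistent: $total == $expected + $surp + $resv"
else
    echo "unexpected hugepage accounting: $total vs $expected + $surp + $resv" >&2
fi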
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78879176 kB' 'MemAvailable: 82178108 kB' 'Buffers: 12176 kB' 'Cached: 9470496 kB' 'SwapCached: 0 kB' 'Active: 6554808 kB' 'Inactive: 3457048 kB' 'Active(anon): 6161024 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531960 kB' 'Mapped: 181368 kB' 'Shmem: 5631840 kB' 'KReclaimable: 203036 kB' 'Slab: 530020 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 326984 kB' 'KernelStack: 16144 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7521620 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.599 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _
[xtrace elided: the HugePages_Surp walk continues over the pairs printed above; Buffers through SUnreclaim are skipped with @32 continue]
00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.600 10:31:16
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.600 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local 
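The xtrace above is exercising get_meminfo's scan of /proc/meminfo: each line is split on ': ', every key other than the requested one hits 'continue', and the matching key's value is echoed back (0 for HugePages_Surp in this run). A minimal stand-alone sketch of that pattern, using a hypothetical helper name rather than the verbatim setup/common.sh code:

    # Sketch of the scan traced above (helper name hypothetical, not the verbatim
    # setup/common.sh implementation): split each /proc/meminfo line on ': ' and
    # echo the value of the first key that matches the request.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < /proc/meminfo
        return 1
    }
    # e.g. get_meminfo_sketch HugePages_Surp   # prints 0 in the run logged here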
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78879264 kB' 'MemAvailable: 82178196 kB' 'Buffers: 12176 kB' 'Cached: 9470512 kB' 'SwapCached: 0 kB' 'Active: 6554332 kB' 'Inactive: 3457048 kB' 'Active(anon): 6160548 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531932 kB' 'Mapped: 181292 kB' 'Shmem: 5631856 kB' 'KReclaimable: 203036 kB' 'Slab: 530004 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 326968 kB' 'KernelStack: 16144 kB' 'PageTables: 8140 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7521644 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:41.601 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... the same test/continue/IFS/read xtrace repeats for every key from MemFree through HugePages_Free, none of which match HugePages_Rsvd ...]
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:41.907 nr_hugepages=1024
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:41.907 resv_hugepages=0
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:41.907 surplus_hugepages=0
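Note the '[[ -e /sys/devices/system/node/node/meminfo ]]' probe in the trace above: get_meminfo was called without a node argument, so the test against the empty per-node path fails, the scan falls back to /proc/meminfo, and the 'mem=("${mem[@]#Node +([0-9]) }")' step only matters when a per-node file is actually read. A rough sketch of that per-node path, paraphrased from the trace rather than copied from setup/common.sh, assuming a node0 directory exists on the test system:

    # Per-node variant of the same scan (assumed layout; paraphrased from the trace):
    # nodeN/meminfo lines carry a "Node <N> " prefix that is stripped before matching.
    shopt -s extglob
    node_meminfo_sketch() {
        local node=$1 get=$2 mem_f mem var val _
        mem_f=/sys/devices/system/node/node${node}/meminfo
        [[ -e $mem_f ]] || mem_f=/proc/meminfo        # empty/missing node falls back, as traced
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")              # drop the "Node N " prefix, if any
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }
    # e.g. node_meminfo_sketch 0 HugePages_Total     # per-node count (hypothetical node0)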
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:41.907 anon_hugepages=0
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78890932 kB' 'MemAvailable: 82189864 kB' 'Buffers: 12176 kB' 'Cached: 9470536 kB' 'SwapCached: 0 kB' 'Active: 6554560 kB' 'Inactive: 3457048 kB' 'Active(anon): 6160776 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532148 kB' 'Mapped: 181288 kB' 'Shmem: 5631880 kB' 'KReclaimable: 203036 kB' 'Slab: 530004 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 326968 kB' 'KernelStack: 16144 kB' 'PageTables: 8192 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7522764 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:41.907 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... the same test/continue/IFS/read xtrace repeats for every key from MemFree through Unaccepted, none of which match HugePages_Total ...]
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc --
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37738472 kB' 'MemUsed: 10331412 kB' 'SwapCached: 0 kB' 'Active: 5273436 kB' 'Inactive: 3294260 kB' 'Active(anon): 5018176 kB' 'Inactive(anon): 0 kB' 'Active(file): 255260 kB' 'Inactive(file): 3294260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8278164 kB' 'Mapped: 118024 kB' 'AnonPages: 292712 kB' 'Shmem: 4728644 kB' 'KernelStack: 8936 kB' 'PageTables: 4600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98392 kB' 'Slab: 283256 kB' 'SReclaimable: 98392 kB' 'SUnreclaim: 184864 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.909 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.910 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223616 kB' 'MemFree: 41161068 kB' 'MemUsed: 3062548 kB' 'SwapCached: 0 kB' 'Active: 1281232 kB' 'Inactive: 162788 kB' 'Active(anon): 1142708 kB' 'Inactive(anon): 0 kB' 'Active(file): 138524 kB' 'Inactive(file): 162788 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1204592 kB' 'Mapped: 63280 kB' 'AnonPages: 238956 kB' 'Shmem: 903280 kB' 'KernelStack: 7176 kB' 'PageTables: 3488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104644 kB' 'Slab: 246748 kB' 'SReclaimable: 104644 kB' 'SUnreclaim: 142104 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.911 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 
10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:41.912 node0=512 expecting 512 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:41.912 node1=512 expecting 512 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:41.912 00:04:41.912 real 0m3.739s 00:04:41.912 user 0m1.356s 00:04:41.912 sys 0m2.448s 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:41.912 10:31:16 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:41.912 ************************************ 00:04:41.912 END TEST per_node_1G_alloc 00:04:41.912 ************************************ 00:04:41.912 10:31:16 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:41.912 10:31:16 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:41.912 10:31:16 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:41.912 10:31:16 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:41.912 10:31:16 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set 
+x 00:04:41.912 ************************************ 00:04:41.912 START TEST even_2G_alloc 00:04:41.912 ************************************ 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:41.912 10:31:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:46.107 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:46.107 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:46.107 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:46.108 0000:00:04.6 (8086 2021): Already using 
the vfio-pci driver 00:04:46.108 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:46.108 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78891120 kB' 'MemAvailable: 82190052 kB' 'Buffers: 12176 kB' 'Cached: 9470644 kB' 'SwapCached: 0 kB' 'Active: 6555988 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162204 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 
8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533544 kB' 'Mapped: 181380 kB' 'Shmem: 5631988 kB' 'KReclaimable: 203036 kB' 'Slab: 530292 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327256 kB' 'KernelStack: 16128 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7522140 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.108 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 
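The xtrace above is setup/common.sh's get_meminfo helper walking /proc/meminfo key by key, skipping every field that does not match the requested name (here AnonHugePages) and handing the matching value back to setup/hugepages.sh, which stores it as anon=0. A minimal sketch of that lookup pattern follows, assuming plain bash; the function name lookup_meminfo, its arguments, and the example values are illustrative stand-ins, not the repo's actual helper, which differs in detail.

    #!/usr/bin/env bash
    # Sketch of the /proc/meminfo lookup pattern seen in the trace above.
    # lookup_meminfo is an illustrative name; the real helper is
    # get_meminfo in test/setup/common.sh.
    shopt -s extglob   # needed for the +([0-9]) pattern used below

    lookup_meminfo() {
        local key=$1 node=${2:-}   # meminfo field to fetch, optional NUMA node
        local src=/proc/meminfo
        # Per-node figures live under sysfs; fall back to the global file.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            src=/sys/devices/system/node/node$node/meminfo
        fi
        local -a lines
        mapfile -t lines < "$src"
        # Per-node files prefix every line with "Node <n> "; strip it so the
        # "Key: value" split below works for both sources.
        lines=("${lines[@]#Node +([0-9]) }")
        local line var val _
        for line in "${lines[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$key" ]]; then
                echo "${val:-0}"
                return 0
            fi
        done
        return 1
    }

    # Mirror the calls traced in this log (actual values vary per host).
    anon=$(lookup_meminfo AnonHugePages)     # 0 kB in the run above
    surp=$(lookup_meminfo HugePages_Surp)    # 0 in the run above
    total=$(lookup_meminfo HugePages_Total)  # 1024 in the run above
    echo "anon=$anon surp=$surp total=$total"

The per-key [[ ... == ... ]] / continue pairs that dominate the trace are simply this loop discarding non-matching fields; the echo 0 / return 0 entries at the end of each run are the match being returned.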
00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78890656 kB' 'MemAvailable: 82189588 kB' 'Buffers: 12176 kB' 'Cached: 9470648 kB' 'SwapCached: 0 kB' 'Active: 6555800 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162016 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533312 kB' 'Mapped: 181300 kB' 'Shmem: 5631992 kB' 'KReclaimable: 203036 kB' 'Slab: 530284 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327248 kB' 'KernelStack: 16096 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7521788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.109 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.110 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.111 
10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78890720 kB' 'MemAvailable: 82189652 kB' 'Buffers: 12176 kB' 'Cached: 9470648 kB' 'SwapCached: 0 kB' 'Active: 6555000 kB' 'Inactive: 3457048 kB' 'Active(anon): 6161216 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532492 kB' 'Mapped: 181360 kB' 'Shmem: 5631992 kB' 'KReclaimable: 203036 kB' 'Slab: 530284 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327248 kB' 'KernelStack: 16064 kB' 'PageTables: 7836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7521812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.111 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.112 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 
10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:46.113 nr_hugepages=1024 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:46.113 resv_hugepages=0 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:46.113 surplus_hugepages=0 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:46.113 anon_hugepages=0 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78891444 kB' 'MemAvailable: 82190376 kB' 'Buffers: 12176 kB' 'Cached: 9470704 kB' 'SwapCached: 0 kB' 'Active: 6555148 kB' 'Inactive: 3457048 kB' 'Active(anon): 6161364 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532604 kB' 'Mapped: 181360 kB' 'Shmem: 5632048 kB' 'KReclaimable: 203036 kB' 'Slab: 530284 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327248 kB' 'KernelStack: 16048 kB' 'PageTables: 7784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7522956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.113 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
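The loop traced around this point is setup/common.sh's get_meminfo walking a captured /proc/meminfo snapshot field by field until it reaches the key it was asked for (HugePages_Total at this stage, after HugePages_Rsvd returned 0 above). A minimal standalone sketch of that scan, assuming only a readable /proc/meminfo; the function name get_meminfo_value is illustrative and is not the repo's helper:

    get_meminfo_value() {
        local get=$1 var val _
        # Same split the trace shows: IFS=': ' turns "HugePages_Total:    1024"
        # into var=HugePages_Total, val=1024 (any trailing "kB" falls into _).
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    # e.g. get_meminfo_value HugePages_Total   -> 1024 on this runner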
00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.114 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37747148 kB' 'MemUsed: 10322736 kB' 'SwapCached: 0 kB' 'Active: 5275436 kB' 'Inactive: 3294260 kB' 'Active(anon): 5020176 kB' 'Inactive(anon): 0 kB' 'Active(file): 255260 kB' 'Inactive(file): 3294260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8278192 kB' 'Mapped: 118096 kB' 'AnonPages: 294760 kB' 'Shmem: 4728672 kB' 'KernelStack: 8888 kB' 'PageTables: 4364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 
kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98392 kB' 'Slab: 283548 kB' 'SReclaimable: 98392 kB' 'SUnreclaim: 185156 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.115 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
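When get_meminfo is handed a node index, as in the HugePages_Surp lookups traced here, it reads /sys/devices/system/node/node<N>/meminfo instead of /proc/meminfo and strips the "Node <N> " prefix each line carries (the mem=("${mem[@]#Node +([0-9]) }") step shown above). A short sketch of reading the per-node totals that way; the loop and names are illustrative, not the repo's code:

    for node_dir in /sys/devices/system/node/node[0-9]*; do
        node=${node_dir##*node}
        # Per-node meminfo lines look like "Node 0 HugePages_Total:   512";
        # drop the "Node N " prefix, then pick out the requested field.
        total=$(sed 's/^Node [0-9]* *//' "$node_dir/meminfo" |
                awk -F': *' '$1 == "HugePages_Total" {print $2}')
        echo "node${node} HugePages_Total=${total}"    # 512 and 512 in this run
    done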
00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223616 kB' 'MemFree: 41143416 kB' 'MemUsed: 3080200 kB' 'SwapCached: 0 kB' 'Active: 1280596 kB' 'Inactive: 162788 kB' 'Active(anon): 1142072 kB' 'Inactive(anon): 0 kB' 'Active(file): 138524 kB' 'Inactive(file): 162788 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1204716 kB' 'Mapped: 63280 kB' 'AnonPages: 238724 kB' 'Shmem: 903404 kB' 'KernelStack: 7160 kB' 'PageTables: 3428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'KReclaimable: 104644 kB' 'Slab: 246736 kB' 'SReclaimable: 104644 kB' 'SUnreclaim: 142092 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.116 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.117 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
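Once both per-node scans return, the test reduces to the accounting visible in the echoes that follow: the system-wide HugePages_Total must equal nr_hugepages plus surplus plus reserved, and the even 2G allocation expects the 1024 pages split 512/512 across the two nodes. A condensed sketch of that check using the values from this trace (variable names are illustrative):

    nr_hugepages=1024 resv_hugepages=0 surplus_hugepages=0
    system_total=1024                    # HugePages_Total from /proc/meminfo above
    node0=512 node1=512                  # HugePages_Total from node0/node1 sysfs meminfo
    (( system_total == nr_hugepages + surplus_hugepages + resv_hugepages )) || echo 'hugepage accounting mismatch'
    (( node0 == 512 && node1 == 512 )) && echo 'node0=512 node1=512 as expected'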
00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:46.118 node0=512 expecting 512 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:46.118 node1=512 expecting 512 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:46.118 00:04:46.118 real 0m3.920s 00:04:46.118 user 0m1.529s 00:04:46.118 sys 0m2.494s 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:46.118 10:31:20 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:46.118 ************************************ 00:04:46.118 END TEST even_2G_alloc 00:04:46.118 ************************************ 00:04:46.118 10:31:20 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:46.118 10:31:20 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:46.118 10:31:20 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:46.118 10:31:20 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:46.118 10:31:20 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:46.118 ************************************ 00:04:46.118 START TEST odd_alloc 
00:04:46.118 10:31:20 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:46.118 10:31:20 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:46.118 10:31:20 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:46.118 10:31:20 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:46.118 ************************************
00:04:46.118 START TEST odd_alloc
00:04:46.118 ************************************
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:46.118 10:31:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:49.401 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:49.401 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:49.401 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:49.401 0000:00:04.0 through 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:49.664 0000:80:04.0 through 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc [setup/hugepages.sh@89-94: locals node, sorted_t, sorted_s, surp, resv, anon declared]
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:49.664 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78878720 kB' 'MemAvailable: 82177652 kB' 'Buffers: 12176 kB' 'Cached: 9470800 kB' 'SwapCached: 0 kB' 'Active: 6555884 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162100 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533156 kB' 'Mapped: 181312 kB' 'Shmem: 5632144 kB' 'KReclaimable: 203036 kB' 'Slab: 530376 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327340 kB' 'KernelStack: 16048 kB' 'PageTables: 7816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485752 kB' 'Committed_AS: 7522840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc [setup/common.sh@32: each field above, MemTotal through HardwareCorrupted, is read with IFS=': ' and skipped until AnonHugePages matches]
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0
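The AnonHugePages lookup above is the same /proc/meminfo pattern the trace repeats for every counter: read each "Key: value" pair with IFS=': ' and print the value once the requested key matches. Below is a stand-alone sketch of that pattern, not the setup/common.sh source itself; get_meminfo_value is a hypothetical name and the per-node handling is simplified for illustration.

#!/usr/bin/env bash
# Sketch of the meminfo lookup pattern seen in the trace (simplified).
get_meminfo_value() {
	local get=$1 node=${2:-} mem_f line var val _
	mem_f=/proc/meminfo
	# Per-node lookups use the node's own meminfo file when it exists.
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	while IFS= read -r line; do
		# Per-node files prefix each line with "Node N "; strip it before splitting.
		[[ -n $node ]] && line=${line#Node "$node" }
		IFS=': ' read -r var val _ <<< "$line"
		if [[ $var == "$get" ]]; then
			echo "$val"
			return 0
		fi
	done < "$mem_f"
	return 1
}

get_meminfo_value AnonHugePages      # e.g. 0 in the run above
get_meminfo_value HugePages_Total    # e.g. 1025 in the run above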
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:49.665 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78879624 kB' 'MemAvailable: 82178556 kB' 'Buffers: 12176 kB' 'Cached: 9470804 kB' 'SwapCached: 0 kB' 'Active: 6556164 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162380 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533492 kB' 'Mapped: 181312 kB' 'Shmem: 5632148 kB' 'KReclaimable: 203036 kB' 'Slab: 530420 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327384 kB' 'KernelStack: 16048 kB' 'PageTables: 7844 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485752 kB' 'Committed_AS: 7522856 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
00:04:49.666 10:31:24 setup.sh.hugepages.odd_alloc [setup/common.sh@32: each field above, MemTotal through HugePages_Rsvd, is read with IFS=': ' and skipped until HugePages_Surp matches]
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0
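The meminfo snapshots above already show the pool this test expects: HugePages_Total and HugePages_Free are both 1025, HugePages_Rsvd and HugePages_Surp are 0, and Hugetlb (2099200 kB) is exactly HugePages_Total times Hugepagesize (1025 x 2048 kB). A quick stand-alone consistency check along those lines, assuming only the 2048 kB page size is configured so Hugetlb covers a single page size (this is an illustrative script, not part of the harness):

#!/usr/bin/env bash
# Cross-check the hugepage counters reported by /proc/meminfo.
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
free=$(awk '/^HugePages_Free:/ {print $2}' /proc/meminfo)
size_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)
hugetlb_kb=$(awk '/^Hugetlb:/ {print $2}' /proc/meminfo)

echo "HugePages_Total=$total HugePages_Free=$free Hugepagesize=${size_kb}kB Hugetlb=${hugetlb_kb}kB"
if (( hugetlb_kb == total * size_kb )); then
	echo "Hugetlb matches Total * Hugepagesize ($((total * size_kb)) kB)"   # e.g. 1025 * 2048 = 2099200
else
	echo "Mismatch: expected $((total * size_kb)) kB" >&2
fi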
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78879372 kB' 'MemAvailable: 82178304 kB' 'Buffers: 12176 kB' 'Cached: 9470820 kB' 'SwapCached: 0 kB' 'Active: 6556312 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162528 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533660 kB' 'Mapped: 181312 kB' 'Shmem: 5632164 kB' 'KReclaimable: 203036 kB' 'Slab: 530420 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327384 kB' 'KernelStack: 16032 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485752 kB' 'Committed_AS: 7523992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
00:04:49.667 10:31:24 setup.sh.hugepages.odd_alloc [setup/common.sh@32: each field above, MemTotal through FilePmdMapped, is read with IFS=': ' and skipped while the scan looks for HugePages_Rsvd]
setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:49.669 nr_hugepages=1025 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:49.669 resv_hugepages=0 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:49.669 surplus_hugepages=0 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:49.669 anon_hugepages=0 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@18 -- # local node= 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78877864 kB' 'MemAvailable: 82176796 kB' 'Buffers: 12176 kB' 'Cached: 9470840 kB' 'SwapCached: 0 kB' 'Active: 6556220 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162436 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533564 kB' 'Mapped: 181328 kB' 'Shmem: 5632184 kB' 'KReclaimable: 203036 kB' 'Slab: 530420 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327384 kB' 'KernelStack: 16112 kB' 'PageTables: 7656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485752 kB' 'Committed_AS: 7525496 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
[[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.669 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
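
Each of the IFS=': ' read -r var val _ entries in this scan splits a single meminfo line on ':' and ' ' before the key is compared against the one being looked up. A minimal standalone illustration of that split, using the 1025-page value that also appears later in this run (the variable names mirror the traced helper; the literal line is just an example):

    # ':' and ' ' both act as field separators, so the key lands in var,
    # the number in val, and any trailing unit ("kB") in the throwaway "_".
    line='HugePages_Total:    1025'
    IFS=': ' read -r var val _ <<< "$line"
    echo "$var=$val"   # prints HugePages_Total=1025
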
00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.670 
10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.670 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.931 10:31:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.931 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37745864 kB' 'MemUsed: 10324020 kB' 'SwapCached: 0 kB' 'Active: 5276148 kB' 'Inactive: 3294260 kB' 'Active(anon): 5020888 kB' 'Inactive(anon): 0 kB' 'Active(file): 255260 kB' 'Inactive(file): 3294260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8278228 kB' 'Mapped: 118048 kB' 'AnonPages: 295336 kB' 'Shmem: 4728708 kB' 'KernelStack: 8968 kB' 'PageTables: 4468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98392 kB' 'Slab: 
283800 kB' 'SReclaimable: 98392 kB' 'SUnreclaim: 185408 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.932 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223616 kB' 'MemFree: 41130752 kB' 'MemUsed: 3092864 kB' 'SwapCached: 0 kB' 'Active: 1280228 kB' 'Inactive: 162788 kB' 'Active(anon): 1141704 kB' 'Inactive(anon): 0 kB' 'Active(file): 138524 kB' 'Inactive(file): 162788 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1204816 kB' 'Mapped: 63280 kB' 'AnonPages: 238308 kB' 'Shmem: 903504 kB' 'KernelStack: 7144 kB' 'PageTables: 3420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104644 kB' 'Slab: 246620 kB' 'SReclaimable: 104644 kB' 'SUnreclaim: 141976 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.933 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:49.934 10:31:24 
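The loop traced above is setup/common.sh's get_meminfo helper scanning /proc/meminfo one "key: value" pair at a time until it reaches the requested counter (HugePages_Surp here, which comes back as 0). Below is a minimal re-creation of that parser, reconstructed from the xtrace output rather than copied from the real script, so details may differ:

#!/usr/bin/env bash
# Minimal re-creation of the get_meminfo loop traced above. Reconstructed
# from the xtrace output; the real helper lives in the SPDK test tree
# (setup/common.sh) and may differ in detail.
shopt -s extglob

get_meminfo() {
    local get=$1 node=$2
    local var val
    local mem_f=/proc/meminfo mem

    # With a node index, read the per-node counters instead; those lines
    # are prefixed with "Node <N> ", which is stripped below.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")

    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip every other key
        echo "$val"                        # e.g. 0 for HugePages_Surp
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo HugePages_Surp      # system-wide lookup, as in the trace above
get_meminfo HugePages_Free 0    # per-node lookup (assumes node0 exists)

The same helper is invoked repeatedly further down, which is why the identical IFS/read/continue pattern recurs for every /proc/meminfo key before the requested one.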
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:49.934 node0=512 expecting 513 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:49.934 node1=513 expecting 512 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:49.934 00:04:49.934 real 0m3.923s 00:04:49.934 user 0m1.529s 00:04:49.934 sys 0m2.499s 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:49.934 10:31:24 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:49.934 ************************************ 00:04:49.934 END TEST odd_alloc 00:04:49.934 ************************************ 00:04:49.934 10:31:24 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:49.934 10:31:24 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:49.934 10:31:24 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:49.934 10:31:24 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:49.934 10:31:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:49.934 ************************************ 00:04:49.934 START TEST custom_alloc 00:04:49.934 ************************************ 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # 
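Just before the END TEST banner above, hugepages.sh@126-130 prints the per-node summary ("node0=512 expecting 513", "node1=513 expecting 512") and then accepts the result with [[ 512 513 == 512 513 ]]. The trick is that the observed and requested counts are used as indexes of sparse indexed arrays, so listing the keys yields them in ascending order and the comparison is order-independent: the odd extra page may land on either NUMA node. A small stand-alone sketch of that comparison follows; which array holds the requested versus reported counts is my reading of the trace, not taken from the script itself:

#!/usr/bin/env bash
# Order-independent per-node comparison, modelled on hugepages.sh@126-130.
nodes_test=( [0]=512 [1]=513 )   # counts reported per NUMA node (assumed role)
nodes_sys=( [0]=513 [1]=512 )    # counts the test asked for per node (assumed role)
sorted_t=() sorted_s=()

for node in "${!nodes_test[@]}"; do
    # The counts become array indexes; indexed-array keys expand in
    # ascending order, which is what makes the final test order-blind.
    sorted_t[nodes_test[node]]=1
    sorted_s[nodes_sys[node]]=1
    echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
done

[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "per-node totals match"

The check only confirms that the set of per-node counts matches; it deliberately does not care which node received the extra page, which is exactly what an odd allocation spread over two nodes requires.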
user_nodes=() 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:49.934 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- 
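The custom_alloc prologue being traced here converts each requested pool size into a page count (1048576 kB / 2048 kB hugepages = 512 pages, 2097152 kB = 1024 pages), spreads the counts over the two NUMA nodes, and, in the hugepages.sh@181-187 lines that follow, joins them into HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' for setup.sh. A simplified reconstruction of that arithmetic is sketched below; the helper bodies are guesses from the trace, not the real hugepages.sh:

#!/usr/bin/env bash
# Simplified reconstruction of the custom_alloc prologue traced above.
# default_hugepages matches "Hugepagesize: 2048 kB" from the meminfo dumps.
default_hugepages=2048   # kB

get_test_nr_hugepages() {        # pool size in kB -> number of default pages
    local size=$1
    (( size >= default_hugepages )) || return 1
    echo $(( size / default_hugepages ))
}

nodes_hp=()
nodes_hp[0]=$(get_test_nr_hugepages 1048576)   # 1 GiB -> 512 pages on node0
nodes_hp[1]=$(get_test_nr_hugepages 2097152)   # 2 GiB -> 1024 pages on node1

# hugepages.sh@181-183: accumulate the per-node requests and the global total.
HUGENODE=() _nr_hugepages=0
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    (( _nr_hugepages += nodes_hp[node] ))
done

echo "$(IFS=,; echo "${HUGENODE[*]}")"   # nodes_hp[0]=512,nodes_hp[1]=1024
echo "$_nr_hugepages"                    # 1536, later seen as HugePages_Total

The running total of 1536 is what the subsequent verify_nr_hugepages pass expects to see as HugePages_Total in the meminfo dumps further down.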
setup/hugepages.sh@78 -- # return 0 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.935 10:31:25 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:54.126 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:54.126 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:54.126 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:54.126 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:00:04.1 (8086 
2021): Already using the vfio-pci driver 00:04:54.126 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:54.126 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 77852872 kB' 'MemAvailable: 81151804 kB' 'Buffers: 12176 kB' 'Cached: 9470952 kB' 'SwapCached: 0 kB' 'Active: 6557824 kB' 'Inactive: 3457048 kB' 'Active(anon): 6164040 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534444 kB' 'Mapped: 181432 kB' 'Shmem: 5632296 kB' 'KReclaimable: 203036 kB' 'Slab: 530388 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327352 kB' 'KernelStack: 16112 kB' 
'PageTables: 8032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962488 kB' 'Committed_AS: 7523372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.126 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.127 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:54.128 
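The verify_nr_hugepages pass running through this stretch of the trace (hugepages.sh@89-100) has just recorded anon=0 from AnonHugePages (sampled because transparent hugepages report "always [madvise] never", i.e. not fully disabled) and now repeats the same lookup for HugePages_Surp and HugePages_Rsvd before checking the pool itself. The sketch below shows the shape of that bookkeeping; the awk one-liner is only a compact stand-in for the get_meminfo helper shown earlier, and the final assertion is an interpretation of the intent rather than a copy of the script's checks:

#!/usr/bin/env bash
# Shape of the verify_nr_hugepages bookkeeping seen in this trace
# (hugepages.sh@89-100). Stand-in helper and final check are assumptions.
get_meminfo() { awk -v k="$1" -F': +' '$1 == k {print $2+0; exit}' /proc/meminfo; }

nr_hugepages=1536   # 512 (node0) + 1024 (node1), per HUGENODE above

anon=0
thp=/sys/kernel/mm/transparent_hugepage/enabled
# AnonHugePages is only sampled when THP is not fully disabled ("[never]").
if [[ -r $thp && $(<"$thp") != *\[never\]* ]]; then
    anon=$(get_meminfo AnonHugePages)
fi
surp=$(get_meminfo HugePages_Surp)    # surplus pages beyond the configured pool
resv=$(get_meminfo HugePages_Rsvd)    # reserved but not yet faulted-in pages
total=$(get_meminfo HugePages_Total)

echo "total=$total surp=$surp resv=$resv anon=$anon"
(( total == nr_hugepages && surp == 0 )) || echo "unexpected hugepage state" >&2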
10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 77851292 kB' 'MemAvailable: 81150224 kB' 'Buffers: 12176 kB' 'Cached: 9470956 kB' 'SwapCached: 0 kB' 'Active: 6557696 kB' 'Inactive: 3457048 kB' 'Active(anon): 6163912 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534792 kB' 'Mapped: 181440 kB' 'Shmem: 5632300 kB' 'KReclaimable: 203036 kB' 'Slab: 530380 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327344 kB' 'KernelStack: 16096 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962488 kB' 'Committed_AS: 7523024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.128 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 
10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.129 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 77851292 kB' 'MemAvailable: 81150224 kB' 'Buffers: 12176 kB' 'Cached: 9470956 kB' 'SwapCached: 0 kB' 'Active: 6557052 kB' 'Inactive: 3457048 kB' 'Active(anon): 6163268 kB' 'Inactive(anon): 0 kB' 
'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534196 kB' 'Mapped: 181328 kB' 'Shmem: 5632300 kB' 'KReclaimable: 203036 kB' 'Slab: 530380 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327344 kB' 'KernelStack: 16112 kB' 'PageTables: 8036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962488 kB' 'Committed_AS: 7524160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 
10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.130 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:54.131 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:54.132 nr_hugepages=1536 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:54.132 resv_hugepages=0 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:54.132 surplus_hugepages=0 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:54.132 anon_hugepages=0 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 77850536 kB' 'MemAvailable: 81149468 kB' 'Buffers: 12176 kB' 'Cached: 9470996 kB' 'SwapCached: 0 kB' 'Active: 6556732 kB' 'Inactive: 3457048 kB' 'Active(anon): 6162948 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533824 kB' 'Mapped: 181344 kB' 'Shmem: 5632340 kB' 'KReclaimable: 203036 kB' 'Slab: 530380 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327344 kB' 'KernelStack: 16064 kB' 
'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962488 kB' 'Committed_AS: 7525672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.132 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
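
The surrounding setup/hugepages.sh trace (lines @99 through @110) is an accounting check: with 1536 hugepages requested for this custom-allocation case, the surplus and reserved counters must come back as zero and HugePages_Total must equal the request before the per-node split is examined. The following is a standalone sketch of that check under the same assumptions (2048 kB hugepage size, request of 1536 pages); the awk lookups stand in for the get_meminfo calls in the trace.

#!/usr/bin/env bash
# Sketch of the accounting check walked through in the trace: the requested
# hugepage pool must be fully visible in /proc/meminfo with nothing surplus
# and nothing still reserved. Illustration only; the real checks live in
# setup/hugepages.sh.
nr_hugepages=1536   # requested for the custom_alloc case (512 on node0 + 1024 on node1)

surp=$(awk '/^HugePages_Surp:/  {print $2}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd:/  {print $2}' /proc/meminfo)
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)

echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp"

# Same condition as the traced "(( 1536 == nr_hugepages + surp + resv ))":
# the kernel's pool must match the request exactly.
if (( total != nr_hugepages + surp + resv )); then
    echo "hugepage accounting mismatch: total=$total" >&2
    exit 1
fi
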
00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:54.133 10:31:28 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37771204 kB' 'MemUsed: 10298680 kB' 'SwapCached: 0 kB' 'Active: 5275996 kB' 'Inactive: 3294260 kB' 'Active(anon): 5020736 kB' 'Inactive(anon): 0 kB' 'Active(file): 255260 kB' 'Inactive(file): 3294260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8278244 kB' 'Mapped: 118064 kB' 'AnonPages: 295132 kB' 'Shmem: 4728724 kB' 'KernelStack: 9144 kB' 'PageTables: 5032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98392 kB' 'Slab: 283608 kB' 'SReclaimable: 98392 kB' 'SUnreclaim: 185216 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.133 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 
10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- 
00:04:54.134 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [node 0 meminfo scan: Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total and HugePages_Free all fail the HugePages_Surp match; repeated continue / IFS=': ' / read -r var val _ trace elided]
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223616 kB' 'MemFree: 40079080 kB' 'MemUsed: 4144536 kB' 'SwapCached: 0 kB' 'Active: 1282052 kB' 'Inactive: 162788 kB' 'Active(anon): 1143528 kB' 'Inactive(anon): 0 kB' 'Active(file): 138524 kB' 'Inactive(file): 162788 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1204976 kB' 'Mapped: 63280 kB' 'AnonPages: 239992 kB' 'Shmem: 903664 kB' 'KernelStack: 7176 kB' 'PageTables: 3468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104644 kB' 'Slab: 246772 kB' 'SReclaimable: 104644 kB' 'SUnreclaim: 142128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:54.135 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [node 1 meminfo scan: every key from MemTotal through HugePages_Free fails the HugePages_Surp match; repeated continue / IFS=': ' / read -r var val _ trace elided]
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
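For readability, here is what the get_meminfo calls traced above reduce to. This is a condensed bash sketch reconstructed only from the xtrace in this log (file selection, the "Node <n> " prefix strip, and the key scan); it is not the verbatim setup/common.sh implementation.

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern used below

    # get_meminfo <key> [<numa-node>] -> echo the value of <key>, e.g.
    # "get_meminfo HugePages_Surp 1" echoes 0 in the trace above.
    get_meminfo() {
        local get=$1 node=$2
        local var val _ line
        local mem_f mem
        mem_f=/proc/meminfo
        # Per-node queries read the node-local meminfo instead.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Node files prefix every line with "Node <n> "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        # Scan "Key: value ..." pairs until the requested key matches.
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
    }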
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:54.136 node0=512 expecting 512
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:54.136 node1=1024 expecting 1024
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:54.136 
00:04:54.136 real 0m3.931s
00:04:54.136 user 0m1.514s
00:04:54.136 sys 0m2.518s
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:54.136 10:31:28 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:54.136 ************************************
00:04:54.136 END TEST custom_alloc
00:04:54.136 ************************************
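Stripped of the xtrace noise, the tail of custom_alloc above is just an expected-versus-actual comparison of the per-node hugepage layout: one "nodeN=X expecting X" line per node, then the comma-joined counts checked against "512,1024". The bash sketch below mirrors that traced flow; the helper name check_node_alloc and the hard-coded counts are illustrative values taken from this run, not the real hugepages.sh code.

    # Illustrative recap of the traced hugepages.sh@126-130 check.
    check_node_alloc() {
        local nodes_test=(512 1024)      # per-node counts observed in this run
        local node counts=()
        for node in "${!nodes_test[@]}"; do
            echo "node$node=${nodes_test[node]} expecting ${nodes_test[node]}"
            counts+=("${nodes_test[node]}")
        done
        # Same shape as the traced '[[ 512,1024 == 512,1024 ]]' test.
        local joined
        joined=$(IFS=,; echo "${counts[*]}")
        [[ $joined == 512,1024 ]]
    }
    check_node_alloc && echo "custom_alloc layout as expected"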
00:04:54.136 10:31:28 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:54.136 10:31:28 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:54.136 10:31:28 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:54.136 10:31:28 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:54.136 10:31:28 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:54.136 ************************************
00:04:54.136 START TEST no_shrink_alloc
00:04:54.136 ************************************
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:54.136 10:31:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:57.426 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:57.426 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:57.426 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:57.426 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:57.426 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
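The get_test_nr_hugepages 2097152 0 call traced above, just after START TEST no_shrink_alloc, turns a 2097152 kB request into a page count using the default 2 MB hugepage size reported in the meminfo dumps (Hugepagesize: 2048 kB): 2097152 / 2048 = 1024 pages, all assigned to the single requested node 0. A minimal sketch of that arithmetic, reusing the variable names from the trace but not the real hugepages.sh body:

    # Sketch: derive nr_hugepages and the per-node split the way the trace shows.
    get_test_nr_hugepages() {
        local size=$1; shift               # requested size in kB (2097152 here)
        local node_ids=("$@")              # remaining args are NUMA node ids ("0")
        local default_hugepages=2048       # kB, i.e. 'Hugepagesize: 2048 kB'
        local nr_hugepages=0 n
        (( size >= default_hugepages )) && nr_hugepages=$(( size / default_hugepages ))
        nodes_test=()
        # Give the full count to every explicitly requested node, as in the
        # traced 'nodes_test[_no_nodes]=1024' assignment for node 0.
        for n in "${node_ids[@]}"; do
            nodes_test[n]=$nr_hugepages
        done
        echo "nr_hugepages=$nr_hugepages on node(s): ${!nodes_test[*]}"
    }
    get_test_nr_hugepages 2097152 0        # -> nr_hugepages=1024 on node(s): 0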
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78871188 kB' 'MemAvailable: 82170120 kB' 'Buffers: 12176 kB' 'Cached: 9471096 kB' 'SwapCached: 0 kB' 'Active: 6558044 kB' 'Inactive: 3457048 kB' 'Active(anon): 6164260 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534584 kB' 'Mapped: 181436 kB' 'Shmem: 5632440 kB' 'KReclaimable: 203036 kB' 'Slab: 530756 kB' 'SReclaimable: 203036 kB' 'SUnreclaim: 327720 kB' 'KernelStack: 16096 kB' 'PageTables: 7920 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7523696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
00:04:57.426 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [system meminfo scan: every key from MemTotal through HardwareCorrupted fails the AnonHugePages match; repeated continue / IFS=': ' / read -r var val _ trace elided]
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78871336 kB' 'MemAvailable: 82170252 kB' 'Buffers: 12176 kB' 'Cached: 9471108 kB' 'SwapCached: 0 kB' 'Active: 6558064 kB' 'Inactive: 3457048 kB' 'Active(anon): 6164280 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535204 kB' 'Mapped: 181336 kB' 'Shmem: 5632452 kB' 'KReclaimable: 203004 kB' 'Slab: 530700 kB' 'SReclaimable: 203004 kB' 'SUnreclaim: 327696 kB' 'KernelStack: 16096 kB' 'PageTables: 7964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7524088 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB'
00:04:57.427 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [system meminfo scan: keys from MemTotal through NFS_Unstable fail the HugePages_Surp match; repeated continue / IFS=': ' / read -r var val _ trace elided]
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.428 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78871524 kB' 'MemAvailable: 82170440 kB' 'Buffers: 12176 kB' 'Cached: 9471124 kB' 'SwapCached: 0 kB' 'Active: 6558104 kB' 'Inactive: 3457048 kB' 'Active(anon): 6164320 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535152 kB' 'Mapped: 181336 kB' 'Shmem: 5632468 kB' 
'KReclaimable: 203004 kB' 'Slab: 530700 kB' 'SReclaimable: 203004 kB' 'SUnreclaim: 327696 kB' 'KernelStack: 16112 kB' 'PageTables: 8008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7524108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.429 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:57.430 nr_hugepages=1024 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:57.430 resv_hugepages=0 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:57.430 surplus_hugepages=0 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:57.430 anon_hugepages=0 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78871424 kB' 'MemAvailable: 82170340 kB' 'Buffers: 12176 kB' 'Cached: 9471164 kB' 'SwapCached: 0 kB' 'Active: 6557740 kB' 'Inactive: 3457048 kB' 'Active(anon): 6163956 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 
kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534748 kB' 'Mapped: 181336 kB' 'Shmem: 5632508 kB' 'KReclaimable: 203004 kB' 'Slab: 530700 kB' 'SReclaimable: 203004 kB' 'SUnreclaim: 327696 kB' 'KernelStack: 16096 kB' 'PageTables: 7952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7524132 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.430 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 36697332 kB' 'MemUsed: 11372552 kB' 'SwapCached: 0 kB' 'Active: 5275660 kB' 'Inactive: 3294260 kB' 'Active(anon): 5020400 kB' 'Inactive(anon): 0 kB' 'Active(file): 255260 kB' 'Inactive(file): 3294260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8278320 kB' 'Mapped: 118072 kB' 'AnonPages: 294732 kB' 'Shmem: 4728800 kB' 'KernelStack: 8920 kB' 'PageTables: 4500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98360 kB' 'Slab: 283772 kB' 'SReclaimable: 98360 kB' 'SUnreclaim: 185412 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.431 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 
10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:57.432 node0=1024 expecting 1024 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:57.432 10:31:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:00.718 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:00.718 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:00.718 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:00.718 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:00:04.0 
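
The hugepages.sh@110-@130 trace above is the per-node accounting step: the 1024 pages reported by /proc/meminfo are checked against nr_hugepages + surp + resv, then each /sys/devices/system/node/nodeN/meminfo is queried for its share before the script prints "node0=1024 expecting 1024". A minimal standalone sketch of that per-node tally, using plain awk instead of the get_meminfo helper seen in the trace (the script and its output format are illustrative assumptions, not the shipped setup/hugepages.sh):

    #!/usr/bin/env bash
    # Minimal per-node hugepage tally (illustrative; not the shipped setup/hugepages.sh).
    set -euo pipefail

    for node_dir in /sys/devices/system/node/node[0-9]*; do
        node=${node_dir##*node}
        # Per-node meminfo lines look like: "Node 0 HugePages_Total:  1024"
        total=$(awk '$3 == "HugePages_Total:" {print $4}' "$node_dir/meminfo")
        surp=$(awk '$3 == "HugePages_Surp:"  {print $4}' "$node_dir/meminfo")
        echo "node${node}=$((total + surp))"
    done

On the machine in this log it would report node0=1024 and node1=0, matching the nodes_sys values recorded at hugepages.sh@30 above.
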
(8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:00.718 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:00.718 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78858312 kB' 'MemAvailable: 82157228 kB' 'Buffers: 12176 kB' 'Cached: 9471228 kB' 'SwapCached: 0 kB' 'Active: 6559572 kB' 'Inactive: 3457048 kB' 'Active(anon): 6165788 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536516 kB' 'Mapped: 181424 kB' 'Shmem: 5632572 kB' 'KReclaimable: 203004 kB' 'Slab: 530744 kB' 'SReclaimable: 203004 kB' 'SUnreclaim: 327740 kB' 'KernelStack: 16096 kB' 'PageTables: 7948 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7525740 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.718 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- 
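
The scan that just returned ("echo 0" / "return 0" at common.sh@33) is the AnonHugePages lookup used by the anon-hugepage guard at hugepages.sh@96-@97. The same field-scan pattern repeats for every get_meminfo call in this log, so a reconstructed standalone version may be easier to follow than the trace; the function name is mine and details are hedged, but the flow (choose the global or per-node meminfo file, strip the "Node N " prefix, read key/value pairs until the requested key matches) follows the trace:

    # Reconstructed lookup (function name is mine; flow follows the common.sh trace).
    get_meminfo_sketch() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo line var val _
        shopt -s extglob                      # needed for the +([0-9]) prefix pattern
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")      # drop the "Node N " prefix on per-node files
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"                   # e.g. 0 for AnonHugePages, 1024 for HugePages_Total
                return 0
            fi
        done
        return 1
    }

Used as "get_meminfo_sketch AnonHugePages" it prints the value in kB from /proc/meminfo; "get_meminfo_sketch HugePages_Surp 0" reads node0's file instead, which is what produced the "echo 0" result earlier in this section.
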
setup/hugepages.sh@97 -- # anon=0 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.719 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78859368 kB' 'MemAvailable: 82158284 kB' 'Buffers: 12176 kB' 'Cached: 9471228 kB' 'SwapCached: 0 kB' 'Active: 6559404 kB' 'Inactive: 3457048 kB' 'Active(anon): 6165620 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536308 kB' 'Mapped: 181344 kB' 'Shmem: 5632572 kB' 'KReclaimable: 203004 kB' 'Slab: 530728 kB' 'SReclaimable: 203004 kB' 'SUnreclaim: 327724 kB' 'KernelStack: 16288 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7527068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.720 10:31:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.720 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[scan of the remaining /proc/meminfo fields, Inactive(file) through HugePages_Rsvd: none match HugePages_Surp, so each comparison falls through to "continue"]
00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@18 -- # local node= 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78857716 kB' 'MemAvailable: 82156632 kB' 'Buffers: 12176 kB' 'Cached: 9471248 kB' 'SwapCached: 0 kB' 'Active: 6560248 kB' 'Inactive: 3457048 kB' 'Active(anon): 6166464 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537132 kB' 'Mapped: 181344 kB' 'Shmem: 5632592 kB' 'KReclaimable: 203004 kB' 'Slab: 530728 kB' 'SReclaimable: 203004 kB' 'SUnreclaim: 327724 kB' 'KernelStack: 16256 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7563788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201048 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.721 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.721 10:31:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[scan of the /proc/meminfo fields Buffers through HugePages_Free: none match HugePages_Rsvd, so each comparison falls through to "continue"]
00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:00.724 nr_hugepages=1024 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:00.724 resv_hugepages=0 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:00.724 surplus_hugepages=0 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:00.724 anon_hugepages=0 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local
get=HugePages_Total 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293500 kB' 'MemFree: 78857288 kB' 'MemAvailable: 82156204 kB' 'Buffers: 12176 kB' 'Cached: 9471268 kB' 'SwapCached: 0 kB' 'Active: 6559424 kB' 'Inactive: 3457048 kB' 'Active(anon): 6165640 kB' 'Inactive(anon): 0 kB' 'Active(file): 393784 kB' 'Inactive(file): 3457048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536240 kB' 'Mapped: 181336 kB' 'Shmem: 5632612 kB' 'KReclaimable: 203004 kB' 'Slab: 530720 kB' 'SReclaimable: 203004 kB' 'SUnreclaim: 327716 kB' 'KernelStack: 16304 kB' 'PageTables: 8296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486776 kB' 'Committed_AS: 7526744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201048 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988580 kB' 'DirectMap2M: 14415872 kB' 'DirectMap1G: 85983232 kB' 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:00.724 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[scan of the /proc/meminfo fields Buffers through Unaccepted: none match HugePages_Total, so each comparison falls through to "continue"]
00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@20 -- # local mem_f mem 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 36688620 kB' 'MemUsed: 11381264 kB' 'SwapCached: 0 kB' 'Active: 5276268 kB' 'Inactive: 3294260 kB' 'Active(anon): 5021008 kB' 'Inactive(anon): 0 kB' 'Active(file): 255260 kB' 'Inactive(file): 3294260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8278424 kB' 'Mapped: 118080 kB' 'AnonPages: 295252 kB' 'Shmem: 4728904 kB' 'KernelStack: 8888 kB' 'PageTables: 4400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98360 kB' 'Slab: 283800 kB' 'SReclaimable: 98360 kB' 'SUnreclaim: 185440 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 
10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.726 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:00.727 node0=1024 expecting 1024 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:00.727 00:05:00.727 real 0m6.722s 00:05:00.727 user 0m2.402s 00:05:00.727 sys 0m4.318s 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:00.727 10:31:35 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:00.727 ************************************ 00:05:00.727 END TEST no_shrink_alloc 00:05:00.727 ************************************ 00:05:00.727 10:31:35 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:00.727 10:31:35 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:00.727 00:05:00.727 real 0m28.852s 00:05:00.727 user 0m9.778s 00:05:00.727 sys 0m16.906s 00:05:00.727 10:31:35 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:00.727 10:31:35 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:00.727 ************************************ 00:05:00.727 END TEST hugepages 00:05:00.727 ************************************ 00:05:00.727 10:31:35 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:00.727 10:31:35 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:00.727 10:31:35 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:00.727 10:31:35 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:00.727 10:31:35 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:00.727 ************************************ 00:05:00.727 START TEST driver 00:05:00.727 ************************************ 00:05:00.727 10:31:35 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:00.986 * Looking for test storage... 
00:05:00.986 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:00.986 10:31:35 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:00.986 10:31:35 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:00.986 10:31:35 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:06.249 10:31:40 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:06.249 10:31:40 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:06.249 10:31:40 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.249 10:31:40 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:06.249 ************************************ 00:05:06.249 START TEST guess_driver 00:05:06.249 ************************************ 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:06.249 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:06.249 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:06.249 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:06.249 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:06.249 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:06.249 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:06.249 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:06.249 Looking for driver=vfio-pci 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:06.249 10:31:40 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.546 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.547 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:09.548 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:09.548 10:31:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:12.076 10:31:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:12.076 10:31:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:12.076 10:31:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:12.076 10:31:46 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:12.076 10:31:46 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:12.076 10:31:46 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:12.076 10:31:46 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:17.341 00:05:17.341 real 
0m11.202s 00:05:17.341 user 0m2.794s 00:05:17.341 sys 0m5.146s 00:05:17.341 10:31:51 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:17.341 10:31:51 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:17.341 ************************************ 00:05:17.341 END TEST guess_driver 00:05:17.341 ************************************ 00:05:17.341 10:31:51 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:05:17.341 00:05:17.341 real 0m16.117s 00:05:17.341 user 0m4.170s 00:05:17.341 sys 0m7.866s 00:05:17.341 10:31:51 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:17.341 10:31:51 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:17.341 ************************************ 00:05:17.341 END TEST driver 00:05:17.341 ************************************ 00:05:17.341 10:31:52 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:17.341 10:31:52 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:17.341 10:31:52 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:17.341 10:31:52 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.341 10:31:52 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:17.341 ************************************ 00:05:17.341 START TEST devices 00:05:17.341 ************************************ 00:05:17.341 10:31:52 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:17.341 * Looking for test storage... 00:05:17.341 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:17.341 10:31:52 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:17.341 10:31:52 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:17.341 10:31:52 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:17.341 10:31:52 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:20.623 10:31:55 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@200 -- 
# for block in "/sys/block/nvme"!(*c*) 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:20.623 10:31:55 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:20.624 10:31:55 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:20.624 10:31:55 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:20.624 10:31:55 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:20.624 10:31:55 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:20.624 10:31:55 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:20.882 No valid GPT data, bailing 00:05:20.882 10:31:55 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:20.882 10:31:55 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:20.882 10:31:55 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:20.882 10:31:55 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:20.882 10:31:55 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:20.882 10:31:55 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:20.882 10:31:55 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:05:20.882 10:31:55 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:20.882 10:31:55 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:20.882 10:31:55 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:20.882 10:31:55 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:20.882 10:31:55 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:20.882 10:31:55 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:20.882 10:31:55 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:20.882 10:31:55 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:20.882 10:31:55 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:20.882 ************************************ 00:05:20.882 START TEST nvme_mount 00:05:20.882 ************************************ 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- 
setup/common.sh@44 -- # parts=() 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:20.882 10:31:55 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:21.817 Creating new GPT entries in memory. 00:05:21.817 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:21.817 other utilities. 00:05:21.817 10:31:56 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:21.817 10:31:56 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:21.817 10:31:56 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:21.817 10:31:56 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:21.817 10:31:56 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:22.754 Creating new GPT entries in memory. 00:05:22.754 The operation has completed successfully. 
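(The nvme_mount preparation traced around this point reduces to a short partition/format/mount sequence. The sketch below is reconstructed from the traced commands only; it omits the flock on the device and the sync_dev_uevents.sh wait that the real helpers perform, and the paths are taken verbatim from the log.)

# Reconstructed from the traced nvme_mount setup; error handling and udev
# synchronization are left out.
disk=/dev/nvme0n1
mnt=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount

sgdisk "$disk" --zap-all                 # destroy any existing GPT/MBR structures
sgdisk "$disk" --new=1:2048:2099199      # one 1 GiB partition (sectors 2048..2099199)
mkdir -p "$mnt"
mkfs.ext4 -qF "${disk}p1"                # quiet + force, as in the trace
mount "${disk}p1" "$mnt"

(Later in the trace the same mkfs/mount pair is repeated against the whole /dev/nvme0n1 with an explicit 1024M size after the partition is wiped, which is the second verify pass.)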
00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1959331 00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:22.754 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:23.011 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:23.011 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:23.011 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:23.011 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:23.011 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:23.012 10:31:57 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.195 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:27.196 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:27.196 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:27.196 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:27.196 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:27.196 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:27.196 
10:32:01 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:27.196 10:32:01 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:27.196 10:32:02 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.476 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:30.477 10:32:05 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.757 10:32:08 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:33.757 10:32:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:34.016 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:34.016 00:05:34.016 real 0m13.186s 00:05:34.016 user 0m3.937s 00:05:34.016 sys 0m7.249s 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:34.016 10:32:09 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:34.016 ************************************ 00:05:34.016 END TEST nvme_mount 00:05:34.016 ************************************ 00:05:34.016 10:32:09 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:34.016 10:32:09 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:34.016 10:32:09 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:34.016 10:32:09 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:34.016 10:32:09 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:34.016 ************************************ 00:05:34.016 START TEST dm_mount 00:05:34.016 ************************************ 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:34.016 10:32:09 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:34.016 10:32:09 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:35.391 Creating new GPT entries in memory. 00:05:35.391 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:35.391 other utilities. 00:05:35.391 10:32:10 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:35.391 10:32:10 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:35.391 10:32:10 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:35.391 10:32:10 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:35.391 10:32:10 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:36.327 Creating new GPT entries in memory. 00:05:36.327 The operation has completed successfully. 00:05:36.327 10:32:11 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:36.327 10:32:11 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:36.327 10:32:11 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:36.327 10:32:11 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:36.327 10:32:11 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:37.293 The operation has completed successfully. 
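For reference, the two sgdisk calls that produced the "operation has completed successfully" messages above reduce to the short sketch below. It is reconstructed from the xtrace, not from reading setup/common.sh itself: the 1 GiB partition size, the sector arithmetic and the /dev/nvme0n1 target all come from this run, while the trailing partprobe merely stands in for the script's sync_dev_uevents.sh wait.
disk=/dev/nvme0n1
size=$((1073741824 / 512))              # 1 GiB expressed in 512-byte sectors = 2097152
sgdisk "$disk" --zap-all                # clear any existing GPT/MBR metadata first
start=2048
for part in 1 2; do
  end=$((start + size - 1))             # 2048..2099199, then 2099200..4196351, as traced above
  flock "$disk" sgdisk "$disk" --new=${part}:${start}:${end}
  start=$((end + 1))
done
partprobe "$disk"                       # the real script waits on udev uevents here instead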
00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1963597 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
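The device-mapper assembly traced in the preceding lines can be summarised as the sketch below. It is a hedged reconstruction from the xtrace rather than the script itself: the dmsetup table is fed on stdin and never echoed by the trace, so a plain linear concatenation of the two partitions is assumed, and the blockdev calls and the trailing touch are illustrative glue only.
mnt=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")            # partition sizes in 512-byte sectors
s2=$(blockdev --getsz "$p2")
dmsetup create nvme_dm_test <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
readlink -f /dev/mapper/nvme_dm_test    # resolves to /dev/dm-0 in this run
mkfs.ext4 -qF /dev/mapper/nvme_dm_test
mkdir -p "$mnt" && mount /dev/mapper/nvme_dm_test "$mnt"
touch "$mnt/test_dm"                    # the dummy file that the verify step below checks for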
00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:37.293 10:32:12 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.592 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local 
mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:40.851 10:32:15 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.040 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:45.041 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 
(ext4): 53 ef 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:45.041 00:05:45.041 real 0m10.432s 00:05:45.041 user 0m2.687s 00:05:45.041 sys 0m4.860s 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:45.041 10:32:19 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:45.041 ************************************ 00:05:45.041 END TEST dm_mount 00:05:45.041 ************************************ 00:05:45.041 10:32:19 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:45.041 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:45.041 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:45.041 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:45.041 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:45.041 10:32:19 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:45.041 00:05:45.041 real 0m27.849s 00:05:45.041 user 0m7.919s 00:05:45.041 sys 0m14.874s 00:05:45.041 10:32:19 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:45.041 10:32:19 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:45.041 ************************************ 00:05:45.041 END TEST devices 00:05:45.041 ************************************ 00:05:45.041 10:32:19 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:45.041 00:05:45.041 real 1m39.761s 00:05:45.041 user 0m30.270s 00:05:45.041 sys 0m55.246s 00:05:45.041 10:32:19 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:45.041 10:32:19 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:45.041 ************************************ 00:05:45.041 END TEST setup.sh 00:05:45.041 ************************************ 00:05:45.041 10:32:19 -- common/autotest_common.sh@1142 -- # return 0 00:05:45.041 10:32:19 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:48.324 0000:d7:05.5 (8086 201d): 
Skipping not allowed VMD controller at 0000:d7:05.5 00:05:48.324 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:48.324 Hugepages 00:05:48.324 node hugesize free / total 00:05:48.324 node0 1048576kB 0 / 0 00:05:48.324 node0 2048kB 1024 / 1024 00:05:48.324 node1 1048576kB 0 / 0 00:05:48.324 node1 2048kB 1024 / 1024 00:05:48.324 00:05:48.324 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:48.324 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:48.324 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:48.324 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:48.324 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:48.324 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:48.324 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:48.324 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:48.324 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:48.324 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:48.324 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:48.324 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:48.324 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:48.324 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:48.324 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:48.324 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:48.324 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:48.324 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:48.324 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:48.324 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:48.324 10:32:23 -- spdk/autotest.sh@130 -- # uname -s 00:05:48.324 10:32:23 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:48.324 10:32:23 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:48.324 10:32:23 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:51.601 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:51.601 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:51.601 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:51.601 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:54.122 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:54.122 10:32:29 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:55.052 10:32:30 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:55.052 10:32:30 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:55.052 10:32:30 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:55.052 10:32:30 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:55.052 10:32:30 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:55.052 10:32:30 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:55.053 10:32:30 -- 
common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:55.053 10:32:30 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:55.053 10:32:30 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:55.309 10:32:30 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:55.309 10:32:30 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:55.309 10:32:30 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:58.588 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:58.588 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:58.588 Waiting for block devices as requested 00:05:58.588 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:58.588 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:58.588 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:58.588 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:58.846 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:58.846 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:58.846 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:59.104 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:59.104 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:59.104 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:59.361 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:59.362 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:59.362 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:59.619 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:59.619 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:59.619 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:59.878 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:59.878 10:32:34 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:59.878 10:32:34 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:59.878 10:32:34 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:59.878 10:32:34 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:59.878 10:32:34 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:59.878 10:32:34 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:59.878 10:32:34 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:05:59.878 10:32:34 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:59.878 10:32:34 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:59.878 10:32:34 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:59.878 10:32:34 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:05:59.878 10:32:34 -- common/autotest_common.sh@1554 -- # grep 
unvmcap 00:05:59.878 10:32:34 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:59.878 10:32:34 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:59.878 10:32:34 -- common/autotest_common.sh@1557 -- # continue 00:05:59.878 10:32:34 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:59.878 10:32:34 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:59.878 10:32:34 -- common/autotest_common.sh@10 -- # set +x 00:05:59.878 10:32:34 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:59.878 10:32:34 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:59.878 10:32:34 -- common/autotest_common.sh@10 -- # set +x 00:05:59.878 10:32:34 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:04.061 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:04.061 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:04.061 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:04.061 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:06.684 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:06.684 10:32:41 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:06.684 10:32:41 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:06.684 10:32:41 -- common/autotest_common.sh@10 -- # set +x 00:06:06.684 10:32:41 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:06.684 10:32:41 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:06.684 10:32:41 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:06.684 10:32:41 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:06.684 10:32:41 -- common/autotest_common.sh@1577 -- # local bdfs 00:06:06.684 10:32:41 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:06.684 10:32:41 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:06.684 10:32:41 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:06.684 10:32:41 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:06.684 10:32:41 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:06.684 10:32:41 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:06.684 10:32:41 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:06.684 10:32:41 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:06.684 10:32:41 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:06.684 10:32:41 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:06.684 10:32:41 -- common/autotest_common.sh@1580 -- # device=0x0b60 
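Before the 0x0a54 device-id comparison on the next trace line, the controller scan that opal_revert_cleanup replays above can be condensed to the following hedged sketch. The workspace path, the jq filter and the sysfs read are taken verbatim from this run; the loop body itself is only illustrative glue.
rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
mapfile -t bdfs < <("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')
for bdf in "${bdfs[@]}"; do
  device=$(cat "/sys/bus/pci/devices/$bdf/device")   # 0x0b60 for the single drive in this run
  [[ $device == 0x0a54 ]] && printf '%s\n' "$bdf"    # only matching BDFs would get an opal revert
done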
00:06:06.684 10:32:41 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:06:06.684 10:32:41 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:06:06.684 10:32:41 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:06:06.684 10:32:41 -- common/autotest_common.sh@1593 -- # return 0 00:06:06.684 10:32:41 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:06.684 10:32:41 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:06.684 10:32:41 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:06.684 10:32:41 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:06.684 10:32:41 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:07.251 Restarting all devices. 00:06:10.535 lstat() error: No such file or directory 00:06:10.535 QAT Error: No GENERAL section found 00:06:10.535 Failed to configure qat_dev0 00:06:10.535 lstat() error: No such file or directory 00:06:10.535 QAT Error: No GENERAL section found 00:06:10.535 Failed to configure qat_dev1 00:06:10.535 lstat() error: No such file or directory 00:06:10.535 QAT Error: No GENERAL section found 00:06:10.535 Failed to configure qat_dev2 00:06:10.535 enable sriov 00:06:10.535 Checking status of all devices. 00:06:10.535 There is 3 QAT acceleration device(s) in the system: 00:06:10.535 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:06:10.535 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:06:10.535 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:06:11.468 0000:3d:00.0 set to 16 VFs 00:06:12.404 0000:3f:00.0 set to 16 VFs 00:06:12.973 0000:da:00.0 set to 16 VFs 00:06:14.348 Properly configured the qat device with driver uio_pci_generic. 00:06:14.348 10:32:49 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:14.348 10:32:49 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:14.348 10:32:49 -- common/autotest_common.sh@10 -- # set +x 00:06:14.348 10:32:49 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:14.348 10:32:49 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:14.348 10:32:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.348 10:32:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.348 10:32:49 -- common/autotest_common.sh@10 -- # set +x 00:06:14.348 ************************************ 00:06:14.348 START TEST env 00:06:14.348 ************************************ 00:06:14.348 10:32:49 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:14.606 * Looking for test storage... 
00:06:14.606 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:14.606 10:32:49 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:14.606 10:32:49 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.606 10:32:49 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.606 10:32:49 env -- common/autotest_common.sh@10 -- # set +x 00:06:14.606 ************************************ 00:06:14.606 START TEST env_memory 00:06:14.606 ************************************ 00:06:14.606 10:32:49 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:14.606 00:06:14.606 00:06:14.606 CUnit - A unit testing framework for C - Version 2.1-3 00:06:14.606 http://cunit.sourceforge.net/ 00:06:14.606 00:06:14.606 00:06:14.606 Suite: memory 00:06:14.606 Test: alloc and free memory map ...[2024-07-12 10:32:49.663126] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:14.606 passed 00:06:14.606 Test: mem map translation ...[2024-07-12 10:32:49.692310] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:14.606 [2024-07-12 10:32:49.692333] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:14.606 [2024-07-12 10:32:49.692388] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:14.606 [2024-07-12 10:32:49.692402] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:14.606 passed 00:06:14.606 Test: mem map registration ...[2024-07-12 10:32:49.750020] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:14.606 [2024-07-12 10:32:49.750042] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:14.606 passed 00:06:14.866 Test: mem map adjacent registrations ...passed 00:06:14.866 00:06:14.866 Run Summary: Type Total Ran Passed Failed Inactive 00:06:14.866 suites 1 1 n/a 0 0 00:06:14.866 tests 4 4 4 0 0 00:06:14.866 asserts 152 152 152 0 n/a 00:06:14.866 00:06:14.866 Elapsed time = 0.206 seconds 00:06:14.866 00:06:14.866 real 0m0.221s 00:06:14.866 user 0m0.203s 00:06:14.866 sys 0m0.017s 00:06:14.866 10:32:49 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:14.866 10:32:49 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:14.866 ************************************ 00:06:14.866 END TEST env_memory 00:06:14.866 ************************************ 00:06:14.866 10:32:49 env -- common/autotest_common.sh@1142 -- # return 0 00:06:14.866 10:32:49 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:14.866 10:32:49 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.866 10:32:49 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.866 10:32:49 env -- common/autotest_common.sh@10 -- # set +x 00:06:14.866 ************************************ 00:06:14.866 START TEST env_vtophys 00:06:14.866 ************************************ 00:06:14.866 10:32:49 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:14.866 EAL: lib.eal log level changed from notice to debug 00:06:14.866 EAL: Detected lcore 0 as core 0 on socket 0 00:06:14.866 EAL: Detected lcore 1 as core 1 on socket 0 00:06:14.866 EAL: Detected lcore 2 as core 2 on socket 0 00:06:14.866 EAL: Detected lcore 3 as core 3 on socket 0 00:06:14.866 EAL: Detected lcore 4 as core 4 on socket 0 00:06:14.866 EAL: Detected lcore 5 as core 8 on socket 0 00:06:14.866 EAL: Detected lcore 6 as core 9 on socket 0 00:06:14.866 EAL: Detected lcore 7 as core 10 on socket 0 00:06:14.866 EAL: Detected lcore 8 as core 11 on socket 0 00:06:14.866 EAL: Detected lcore 9 as core 16 on socket 0 00:06:14.866 EAL: Detected lcore 10 as core 17 on socket 0 00:06:14.866 EAL: Detected lcore 11 as core 18 on socket 0 00:06:14.866 EAL: Detected lcore 12 as core 19 on socket 0 00:06:14.866 EAL: Detected lcore 13 as core 20 on socket 0 00:06:14.866 EAL: Detected lcore 14 as core 24 on socket 0 00:06:14.866 EAL: Detected lcore 15 as core 25 on socket 0 00:06:14.866 EAL: Detected lcore 16 as core 26 on socket 0 00:06:14.866 EAL: Detected lcore 17 as core 27 on socket 0 00:06:14.866 EAL: Detected lcore 18 as core 0 on socket 1 00:06:14.866 EAL: Detected lcore 19 as core 1 on socket 1 00:06:14.866 EAL: Detected lcore 20 as core 2 on socket 1 00:06:14.866 EAL: Detected lcore 21 as core 3 on socket 1 00:06:14.866 EAL: Detected lcore 22 as core 4 on socket 1 00:06:14.866 EAL: Detected lcore 23 as core 8 on socket 1 00:06:14.866 EAL: Detected lcore 24 as core 9 on socket 1 00:06:14.866 EAL: Detected lcore 25 as core 10 on socket 1 00:06:14.866 EAL: Detected lcore 26 as core 11 on socket 1 00:06:14.866 EAL: Detected lcore 27 as core 16 on socket 1 00:06:14.866 EAL: Detected lcore 28 as core 17 on socket 1 00:06:14.866 EAL: Detected lcore 29 as core 18 on socket 1 00:06:14.866 EAL: Detected lcore 30 as core 19 on socket 1 00:06:14.866 EAL: Detected lcore 31 as core 20 on socket 1 00:06:14.866 EAL: Detected lcore 32 as core 24 on socket 1 00:06:14.866 EAL: Detected lcore 33 as core 25 on socket 1 00:06:14.866 EAL: Detected lcore 34 as core 26 on socket 1 00:06:14.866 EAL: Detected lcore 35 as core 27 on socket 1 00:06:14.866 EAL: Detected lcore 36 as core 0 on socket 0 00:06:14.866 EAL: Detected lcore 37 as core 1 on socket 0 00:06:14.866 EAL: Detected lcore 38 as core 2 on socket 0 00:06:14.866 EAL: Detected lcore 39 as core 3 on socket 0 00:06:14.866 EAL: Detected lcore 40 as core 4 on socket 0 00:06:14.866 EAL: Detected lcore 41 as core 8 on socket 0 00:06:14.866 EAL: Detected lcore 42 as core 9 on socket 0 00:06:14.866 EAL: Detected lcore 43 as core 10 on socket 0 00:06:14.866 EAL: Detected lcore 44 as core 11 on socket 0 00:06:14.866 EAL: Detected lcore 45 as core 16 on socket 0 00:06:14.866 EAL: Detected lcore 46 as core 17 on socket 0 00:06:14.866 EAL: Detected lcore 47 as core 18 on socket 0 00:06:14.866 EAL: Detected lcore 48 as core 19 on socket 0 00:06:14.866 EAL: Detected lcore 49 as core 20 on socket 0 00:06:14.866 EAL: Detected lcore 50 as core 24 on socket 0 00:06:14.866 EAL: Detected lcore 51 as core 25 on socket 0 00:06:14.866 EAL: Detected lcore 52 as core 
26 on socket 0 00:06:14.866 EAL: Detected lcore 53 as core 27 on socket 0 00:06:14.866 EAL: Detected lcore 54 as core 0 on socket 1 00:06:14.866 EAL: Detected lcore 55 as core 1 on socket 1 00:06:14.866 EAL: Detected lcore 56 as core 2 on socket 1 00:06:14.866 EAL: Detected lcore 57 as core 3 on socket 1 00:06:14.866 EAL: Detected lcore 58 as core 4 on socket 1 00:06:14.866 EAL: Detected lcore 59 as core 8 on socket 1 00:06:14.866 EAL: Detected lcore 60 as core 9 on socket 1 00:06:14.866 EAL: Detected lcore 61 as core 10 on socket 1 00:06:14.866 EAL: Detected lcore 62 as core 11 on socket 1 00:06:14.866 EAL: Detected lcore 63 as core 16 on socket 1 00:06:14.866 EAL: Detected lcore 64 as core 17 on socket 1 00:06:14.866 EAL: Detected lcore 65 as core 18 on socket 1 00:06:14.866 EAL: Detected lcore 66 as core 19 on socket 1 00:06:14.866 EAL: Detected lcore 67 as core 20 on socket 1 00:06:14.866 EAL: Detected lcore 68 as core 24 on socket 1 00:06:14.866 EAL: Detected lcore 69 as core 25 on socket 1 00:06:14.866 EAL: Detected lcore 70 as core 26 on socket 1 00:06:14.866 EAL: Detected lcore 71 as core 27 on socket 1 00:06:14.866 EAL: Maximum logical cores by configuration: 128 00:06:14.866 EAL: Detected CPU lcores: 72 00:06:14.866 EAL: Detected NUMA nodes: 2 00:06:14.866 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:14.866 EAL: Detected shared linkage of DPDK 00:06:14.866 EAL: No shared files mode enabled, IPC will be disabled 00:06:14.866 EAL: No shared files mode enabled, IPC is disabled 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.2 
wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:14.866 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:14.867 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:06:14.867 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:06:14.867 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:14.867 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:14.867 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:14.867 EAL: Bus pci wants IOVA as 'PA' 00:06:14.867 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:14.867 EAL: Bus vdev wants IOVA as 'DC' 00:06:14.867 EAL: Selected IOVA mode 'PA' 00:06:14.867 EAL: Probing VFIO support... 00:06:14.867 EAL: IOMMU type 1 (Type 1) is supported 00:06:14.867 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:14.867 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:14.867 EAL: VFIO support initialized 00:06:14.867 EAL: Ask a virtual area of 0x2e000 bytes 00:06:14.867 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:14.867 EAL: Setting up physically contiguous memory... 
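The EAL output above (lcore/NUMA detection, IOVA mode selection, VFIO probing) is produced while the test binary brings up SPDK's environment layer, which wraps DPDK's EAL initialization. A minimal sketch of that bring-up using SPDK's public env API, assuming only spdk/env.h; the process name and core mask here are illustrative, not taken from this run:

    #include <stdio.h>
    #include "spdk/env.h"

    int
    main(void)
    {
        struct spdk_env_opts opts;

        /* spdk_env_init() drives the EAL initialization recorded in this log:
         * lcore/NUMA detection, IOVA mode selection and VFIO probing. */
        spdk_env_opts_init(&opts);
        opts.name = "env_example";   /* illustrative process name */
        opts.core_mask = "0x1";      /* illustrative single-core mask */

        if (spdk_env_init(&opts) < 0) {
            fprintf(stderr, "Unable to initialize SPDK env\n");
            return 1;
        }

        /* ... DMA allocations, vtophys translations, device access ... */
        return 0;
    }

The "Selected IOVA mode 'PA'" line above follows from the probed qat devices each reporting "wants IOVA as 'PA'", so the EAL settles on physical addressing for this process.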
00:06:14.867 EAL: Setting maximum number of open files to 524288 00:06:14.867 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:14.867 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:14.867 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:14.867 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:14.867 EAL: Ask a virtual area of 0x61000 bytes 00:06:14.867 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:14.867 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:14.867 EAL: Ask a virtual area of 0x400000000 bytes 00:06:14.867 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:06:14.867 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:14.867 EAL: Hugepages will be freed exactly as allocated. 00:06:14.867 EAL: No shared files mode enabled, IPC is disabled 00:06:14.867 EAL: No shared files mode enabled, IPC is disabled 00:06:14.867 EAL: TSC frequency is ~2300000 KHz 00:06:14.867 EAL: Main lcore 0 is ready (tid=7f1a96c17b00;cpuset=[0]) 00:06:14.867 EAL: Trying to obtain current memory policy. 00:06:14.867 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:14.867 EAL: Restoring previous memory policy: 0 00:06:14.867 EAL: request: mp_malloc_sync 00:06:14.867 EAL: No shared files mode enabled, IPC is disabled 00:06:14.867 EAL: Heap on socket 0 was expanded by 2MB 00:06:14.867 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001000000 00:06:14.867 EAL: PCI memory mapped at 0x202001001000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001002000 00:06:14.867 EAL: PCI memory mapped at 0x202001003000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001004000 00:06:14.867 EAL: PCI memory mapped at 0x202001005000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001006000 00:06:14.867 EAL: PCI memory mapped at 0x202001007000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001008000 00:06:14.867 EAL: PCI memory mapped at 0x202001009000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x20200100a000 00:06:14.867 EAL: PCI memory mapped at 0x20200100b000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x20200100c000 00:06:14.867 EAL: PCI memory mapped at 0x20200100d000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x20200100e000 00:06:14.867 EAL: PCI memory mapped at 0x20200100f000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001010000 00:06:14.867 EAL: PCI memory mapped at 0x202001011000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 
EAL: PCI memory mapped at 0x202001012000 00:06:14.867 EAL: PCI memory mapped at 0x202001013000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001014000 00:06:14.867 EAL: PCI memory mapped at 0x202001015000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001016000 00:06:14.867 EAL: PCI memory mapped at 0x202001017000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001018000 00:06:14.867 EAL: PCI memory mapped at 0x202001019000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x20200101a000 00:06:14.867 EAL: PCI memory mapped at 0x20200101b000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x20200101c000 00:06:14.867 EAL: PCI memory mapped at 0x20200101d000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:14.867 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x20200101e000 00:06:14.867 EAL: PCI memory mapped at 0x20200101f000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:14.867 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001020000 00:06:14.867 EAL: PCI memory mapped at 0x202001021000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:14.867 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001022000 00:06:14.867 EAL: PCI memory mapped at 0x202001023000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:14.867 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001024000 00:06:14.867 EAL: PCI memory mapped at 0x202001025000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:14.867 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001026000 00:06:14.867 EAL: PCI memory mapped at 0x202001027000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:14.867 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:06:14.867 EAL: probe driver: 8086:37c9 qat 00:06:14.867 EAL: PCI memory mapped at 0x202001028000 00:06:14.867 EAL: PCI memory mapped at 0x202001029000 00:06:14.867 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 
00:06:14.868 EAL: PCI memory mapped at 0x20200102a000 00:06:14.868 EAL: PCI memory mapped at 0x20200102b000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200102c000 00:06:14.868 EAL: PCI memory mapped at 0x20200102d000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200102e000 00:06:14.868 EAL: PCI memory mapped at 0x20200102f000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001030000 00:06:14.868 EAL: PCI memory mapped at 0x202001031000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001032000 00:06:14.868 EAL: PCI memory mapped at 0x202001033000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001034000 00:06:14.868 EAL: PCI memory mapped at 0x202001035000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001036000 00:06:14.868 EAL: PCI memory mapped at 0x202001037000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001038000 00:06:14.868 EAL: PCI memory mapped at 0x202001039000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200103a000 00:06:14.868 EAL: PCI memory mapped at 0x20200103b000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200103c000 00:06:14.868 EAL: PCI memory mapped at 0x20200103d000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:14.868 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200103e000 00:06:14.868 EAL: PCI memory mapped at 0x20200103f000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:14.868 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001040000 00:06:14.868 EAL: PCI memory mapped at 0x202001041000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:14.868 EAL: Trying to obtain current memory policy. 
00:06:14.868 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:14.868 EAL: Restoring previous memory policy: 4 00:06:14.868 EAL: request: mp_malloc_sync 00:06:14.868 EAL: No shared files mode enabled, IPC is disabled 00:06:14.868 EAL: Heap on socket 1 was expanded by 2MB 00:06:14.868 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001042000 00:06:14.868 EAL: PCI memory mapped at 0x202001043000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001044000 00:06:14.868 EAL: PCI memory mapped at 0x202001045000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001046000 00:06:14.868 EAL: PCI memory mapped at 0x202001047000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001048000 00:06:14.868 EAL: PCI memory mapped at 0x202001049000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200104a000 00:06:14.868 EAL: PCI memory mapped at 0x20200104b000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200104c000 00:06:14.868 EAL: PCI memory mapped at 0x20200104d000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200104e000 00:06:14.868 EAL: PCI memory mapped at 0x20200104f000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001050000 00:06:14.868 EAL: PCI memory mapped at 0x202001051000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001052000 00:06:14.868 EAL: PCI memory mapped at 0x202001053000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001054000 00:06:14.868 EAL: PCI memory mapped at 0x202001055000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001056000 00:06:14.868 EAL: PCI memory mapped at 0x202001057000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 
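The "Setting policy MPOL_PREFERRED for socket 1" / "Heap on socket 1 was expanded by 2MB" messages at the start of this stretch appear when the first allocation is served from that socket: the EAL sets a preferred NUMA policy, faults in hugepages there, and grows the per-socket heap. A minimal sketch of a socket-pinned DMA allocation with the SPDK env API (the helper name alloc_on_socket is illustrative):

    #include "spdk/env.h"

    /* Allocate DMA-capable memory pinned to one NUMA socket. The first such
     * allocation on a socket triggers the per-socket heap-expansion messages
     * recorded in this log. */
    static void *
    alloc_on_socket(size_t size, int socket_id)
    {
        return spdk_malloc(size, 0x1000 /* 4 KiB alignment */, NULL,
                           socket_id, SPDK_MALLOC_DMA);
    }

Passing SPDK_ENV_SOCKET_ID_ANY instead of an explicit socket leaves the placement decision to the allocator.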
00:06:14.868 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x202001058000 00:06:14.868 EAL: PCI memory mapped at 0x202001059000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200105a000 00:06:14.868 EAL: PCI memory mapped at 0x20200105b000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200105c000 00:06:14.868 EAL: PCI memory mapped at 0x20200105d000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:14.868 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:06:14.868 EAL: probe driver: 8086:37c9 qat 00:06:14.868 EAL: PCI memory mapped at 0x20200105e000 00:06:14.868 EAL: PCI memory mapped at 0x20200105f000 00:06:14.868 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:14.868 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: No PCI address specified using 'addr=' in: bus=pci 00:06:15.127 EAL: Mem event callback 'spdk:(nil)' registered 00:06:15.127 00:06:15.127 00:06:15.127 CUnit - A unit testing framework for C - Version 2.1-3 00:06:15.127 http://cunit.sourceforge.net/ 00:06:15.127 00:06:15.127 00:06:15.127 Suite: components_suite 00:06:15.127 Test: vtophys_malloc_test ...passed 00:06:15.127 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 4MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was shrunk by 4MB 00:06:15.127 EAL: Trying to obtain current memory policy. 00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 6MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was shrunk by 6MB 00:06:15.127 EAL: Trying to obtain current memory policy. 
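The "Heap on socket 0 was expanded by N MB" / "shrunk by N MB" pairs running through this stretch come from vtophys_spdk_malloc_test allocating progressively larger buffers, translating them, and freeing them again; each expansion and shrink also invokes the registered 'spdk:(nil)' mem event callback seen in these lines. The following is not the actual test source, only a rough sketch of the pattern it exercises (the helper name check_vtophys is illustrative):

    #include <inttypes.h>
    #include <stdio.h>
    #include "spdk/env.h"

    static int
    check_vtophys(size_t size)
    {
        void *buf;
        uint64_t len = size;
        uint64_t paddr;

        /* The first allocation of a new size grows the DPDK heap on the
         * chosen socket ("Heap on socket 0 was expanded by ..."). */
        buf = spdk_malloc(size, 0x1000, NULL, SPDK_ENV_SOCKET_ID_ANY,
                          SPDK_MALLOC_DMA);
        if (buf == NULL) {
            return -1;
        }

        /* spdk_vtophys() resolves the virtual address to a physical one,
         * which is what the env_vtophys suite verifies. */
        paddr = spdk_vtophys(buf, &len);
        if (paddr == SPDK_VTOPHYS_ERROR) {
            spdk_free(buf);
            return -1;
        }
        printf("va=%p pa=0x%" PRIx64 " mapped=%" PRIu64 "\n", buf, paddr, len);

        /* Freeing the buffer lets the heap contract again ("... was shrunk by ..."). */
        spdk_free(buf);
        return 0;
    }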
00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 10MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was shrunk by 10MB 00:06:15.127 EAL: Trying to obtain current memory policy. 00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 18MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was shrunk by 18MB 00:06:15.127 EAL: Trying to obtain current memory policy. 00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 34MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was shrunk by 34MB 00:06:15.127 EAL: Trying to obtain current memory policy. 00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 66MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was shrunk by 66MB 00:06:15.127 EAL: Trying to obtain current memory policy. 00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 130MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was shrunk by 130MB 00:06:15.127 EAL: Trying to obtain current memory policy. 
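The Run Summary table that closes this suite a few lines below (suites/tests/asserts plus elapsed time) is printed by CUnit's basic interface, which these env tests are built on. A self-contained sketch of a comparable harness, assuming only the standard CUnit headers; the test body here is a placeholder, not SPDK's:

    #include <CUnit/Basic.h>

    static void
    sample_test(void)
    {
        CU_ASSERT(1 + 1 == 2);
    }

    int
    main(void)
    {
        CU_pSuite suite;
        unsigned int num_failures;

        if (CU_initialize_registry() != CUE_SUCCESS) {
            return CU_get_error();
        }

        suite = CU_add_suite("components_suite", NULL, NULL);
        if (suite == NULL || CU_add_test(suite, "sample_test", sample_test) == NULL) {
            CU_cleanup_registry();
            return CU_get_error();
        }

        /* CU_BRM_VERBOSE prints the per-test "passed" lines and the
         * Run Summary table in the format seen in this log. */
        CU_basic_set_mode(CU_BRM_VERBOSE);
        CU_basic_run_tests();
        num_failures = CU_get_number_of_failures();
        CU_cleanup_registry();

        return num_failures;
    }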
00:06:15.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.127 EAL: Restoring previous memory policy: 4 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.127 EAL: request: mp_malloc_sync 00:06:15.127 EAL: No shared files mode enabled, IPC is disabled 00:06:15.127 EAL: Heap on socket 0 was expanded by 258MB 00:06:15.127 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.385 EAL: request: mp_malloc_sync 00:06:15.385 EAL: No shared files mode enabled, IPC is disabled 00:06:15.385 EAL: Heap on socket 0 was shrunk by 258MB 00:06:15.385 EAL: Trying to obtain current memory policy. 00:06:15.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.385 EAL: Restoring previous memory policy: 4 00:06:15.385 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.385 EAL: request: mp_malloc_sync 00:06:15.385 EAL: No shared files mode enabled, IPC is disabled 00:06:15.385 EAL: Heap on socket 0 was expanded by 514MB 00:06:15.385 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.643 EAL: request: mp_malloc_sync 00:06:15.643 EAL: No shared files mode enabled, IPC is disabled 00:06:15.643 EAL: Heap on socket 0 was shrunk by 514MB 00:06:15.643 EAL: Trying to obtain current memory policy. 00:06:15.643 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:15.901 EAL: Restoring previous memory policy: 4 00:06:15.901 EAL: Calling mem event callback 'spdk:(nil)' 00:06:15.901 EAL: request: mp_malloc_sync 00:06:15.901 EAL: No shared files mode enabled, IPC is disabled 00:06:15.901 EAL: Heap on socket 0 was expanded by 1026MB 00:06:15.901 EAL: Calling mem event callback 'spdk:(nil)' 00:06:16.159 EAL: request: mp_malloc_sync 00:06:16.159 EAL: No shared files mode enabled, IPC is disabled 00:06:16.159 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:16.159 passed 00:06:16.159 00:06:16.159 Run Summary: Type Total Ran Passed Failed Inactive 00:06:16.159 suites 1 1 n/a 0 0 00:06:16.159 tests 2 2 2 0 0 00:06:16.159 asserts 5624 5624 5624 0 n/a 00:06:16.159 00:06:16.159 Elapsed time = 1.175 seconds 00:06:16.159 EAL: No shared files mode enabled, IPC is disabled 00:06:16.159 EAL: No shared files mode enabled, IPC is disabled 00:06:16.159 EAL: No shared files mode enabled, IPC is disabled 00:06:16.159 00:06:16.159 real 0m1.366s 00:06:16.159 user 0m0.759s 00:06:16.159 sys 0m0.579s 00:06:16.159 10:32:51 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.159 10:32:51 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:16.159 ************************************ 00:06:16.159 END TEST env_vtophys 00:06:16.159 ************************************ 00:06:16.159 10:32:51 env -- common/autotest_common.sh@1142 -- # return 0 00:06:16.159 10:32:51 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:16.159 10:32:51 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:16.159 10:32:51 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.159 10:32:51 env -- common/autotest_common.sh@10 -- # set +x 00:06:16.418 ************************************ 00:06:16.419 START TEST env_pci 00:06:16.419 ************************************ 00:06:16.419 10:32:51 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:16.419 00:06:16.419 00:06:16.419 CUnit - A unit testing framework for C - Version 2.1-3 00:06:16.419 http://cunit.sourceforge.net/ 00:06:16.419 00:06:16.419 00:06:16.419 Suite: pci 00:06:16.419 Test: 
pci_hook ...[2024-07-12 10:32:51.385619] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1974188 has claimed it 00:06:16.419 EAL: Cannot find device (10000:00:01.0) 00:06:16.419 EAL: Failed to attach device on primary process 00:06:16.419 passed 00:06:16.419 00:06:16.419 Run Summary: Type Total Ran Passed Failed Inactive 00:06:16.419 suites 1 1 n/a 0 0 00:06:16.419 tests 1 1 1 0 0 00:06:16.419 asserts 25 25 25 0 n/a 00:06:16.419 00:06:16.419 Elapsed time = 0.042 seconds 00:06:16.419 00:06:16.419 real 0m0.070s 00:06:16.419 user 0m0.021s 00:06:16.419 sys 0m0.048s 00:06:16.419 10:32:51 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.419 10:32:51 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:16.419 ************************************ 00:06:16.419 END TEST env_pci 00:06:16.419 ************************************ 00:06:16.419 10:32:51 env -- common/autotest_common.sh@1142 -- # return 0 00:06:16.419 10:32:51 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:16.419 10:32:51 env -- env/env.sh@15 -- # uname 00:06:16.419 10:32:51 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:16.419 10:32:51 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:16.419 10:32:51 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:16.419 10:32:51 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:06:16.419 10:32:51 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.419 10:32:51 env -- common/autotest_common.sh@10 -- # set +x 00:06:16.419 ************************************ 00:06:16.419 START TEST env_dpdk_post_init 00:06:16.419 ************************************ 00:06:16.419 10:32:51 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:16.419 EAL: Detected CPU lcores: 72 00:06:16.419 EAL: Detected NUMA nodes: 2 00:06:16.419 EAL: Detected shared linkage of DPDK 00:06:16.419 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:16.419 EAL: Selected IOVA mode 'PA' 00:06:16.419 EAL: VFIO support initialized 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: 
Creating cryptodev 0000:3d:02.2_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.419 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:16.419 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 
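Each QAT virtual function probed here is registered with DPDK's cryptodev layer twice, once as a symmetric (_qat_sym) and once as an asymmetric (_qat_asym) device, which is why every "Probe PCI driver" line is followed by two "Creating cryptodev" entries and their initialisation parameters. Once initialization has finished, the resulting devices can be listed through the public cryptodev API; a minimal sketch, illustrative only and not part of this test:

    #include <stdio.h>
    #include <rte_cryptodev.h>

    /* Enumerate the cryptodevs created during probe (e.g. the
     * ..._qat_sym / ..._qat_asym instances logged above). */
    static void
    list_cryptodevs(void)
    {
        uint8_t nb_devs = rte_cryptodev_count();

        for (uint8_t id = 0; id < nb_devs; id++) {
            struct rte_cryptodev_info info;

            rte_cryptodev_info_get(id, &info);
            printf("cryptodev %u: driver=%s socket=%d max_qp=%u\n",
                   (unsigned)id, info.driver_name,
                   rte_cryptodev_socket_id(id), info.max_nb_queue_pairs);
        }
    }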
00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters 
- name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:16.420 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.420 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:16.421 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:16.421 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:16.421 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:16.680 EAL: Using IOMMU type 1 (Type 1) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:06:16.680 EAL: Ignore mapping IO port bar(1) 00:06:16.680 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:06:16.938 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: 
spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:06:16.938 EAL: Ignore mapping IO port bar(1) 00:06:16.938 EAL: Ignore mapping IO port bar(5) 00:06:16.938 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:06:17.195 EAL: Ignore mapping IO port bar(1) 00:06:17.195 EAL: Ignore mapping IO port bar(5) 00:06:17.195 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:06:19.721 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:06:19.721 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:06:19.721 Starting DPDK initialization... 00:06:19.721 Starting SPDK post initialization... 00:06:19.721 SPDK NVMe probe 00:06:19.721 Attaching to 0000:5e:00.0 00:06:19.721 Attached to 0000:5e:00.0 00:06:19.721 Cleaning up... 00:06:19.721 00:06:19.721 real 0m3.296s 00:06:19.721 user 0m2.251s 00:06:19.721 sys 0m0.604s 00:06:19.721 10:32:54 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.721 10:32:54 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:19.721 ************************************ 00:06:19.721 END TEST env_dpdk_post_init 00:06:19.721 ************************************ 00:06:19.721 10:32:54 env -- common/autotest_common.sh@1142 -- # return 0 00:06:19.721 10:32:54 env -- env/env.sh@26 -- # uname 00:06:19.721 10:32:54 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:19.722 10:32:54 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:19.722 10:32:54 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.722 10:32:54 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.722 10:32:54 env -- common/autotest_common.sh@10 -- # set +x 00:06:19.722 ************************************ 00:06:19.722 START TEST env_mem_callbacks 00:06:19.722 ************************************ 00:06:19.722 10:32:54 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:19.722 EAL: Detected CPU lcores: 72 00:06:19.722 EAL: Detected NUMA nodes: 2 00:06:19.722 EAL: Detected shared linkage of DPDK 00:06:19.722 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:19.981 EAL: Selected IOVA mode 'PA' 00:06:19.981 EAL: VFIO support initialized 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:19.981 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.981 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.981 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:19.981 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue 
pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating 
cryptodev 0000:3f:02.0_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:19.982 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.982 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:19.982 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.982 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.983 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:19.983 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.983 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:19.983 00:06:19.983 00:06:19.983 CUnit - A unit testing framework for C - Version 2.1-3 00:06:19.983 http://cunit.sourceforge.net/ 00:06:19.983 00:06:19.983 00:06:19.983 Suite: memory 00:06:19.983 Test: test ... 
00:06:19.983 register 0x200000200000 2097152 00:06:19.983 register 0x201000a00000 2097152 00:06:19.983 malloc 3145728 00:06:19.983 register 0x200000400000 4194304 00:06:19.983 buf 0x200000500000 len 3145728 PASSED 00:06:19.983 malloc 64 00:06:19.983 buf 0x2000004fff40 len 64 PASSED 00:06:19.983 malloc 4194304 00:06:19.983 register 0x200000800000 6291456 00:06:19.983 buf 0x200000a00000 len 4194304 PASSED 00:06:19.983 free 0x200000500000 3145728 00:06:19.983 free 0x2000004fff40 64 00:06:19.983 unregister 0x200000400000 4194304 PASSED 00:06:19.983 free 0x200000a00000 4194304 00:06:19.983 unregister 0x200000800000 6291456 PASSED 00:06:19.983 malloc 8388608 00:06:19.983 register 0x200000400000 10485760 00:06:19.983 buf 0x200000600000 len 8388608 PASSED 00:06:19.983 free 0x200000600000 8388608 00:06:19.983 unregister 0x200000400000 10485760 PASSED 00:06:19.983 passed 00:06:19.983 00:06:19.983 Run Summary: Type Total Ran Passed Failed Inactive 00:06:19.983 suites 1 1 n/a 0 0 00:06:19.983 tests 1 1 1 0 0 00:06:19.983 asserts 16 16 16 0 n/a 00:06:19.983 00:06:19.983 Elapsed time = 0.006 seconds 00:06:19.983 00:06:19.983 real 0m0.109s 00:06:19.983 user 0m0.035s 00:06:19.983 sys 0m0.073s 00:06:19.983 10:32:54 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.983 10:32:54 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:19.983 ************************************ 00:06:19.983 END TEST env_mem_callbacks 00:06:19.983 ************************************ 00:06:19.983 10:32:55 env -- common/autotest_common.sh@1142 -- # return 0 00:06:19.983 00:06:19.983 real 0m5.564s 00:06:19.983 user 0m3.471s 00:06:19.983 sys 0m1.666s 00:06:19.983 10:32:55 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.983 10:32:55 env -- common/autotest_common.sh@10 -- # set +x 00:06:19.983 ************************************ 00:06:19.983 END TEST env 00:06:19.983 ************************************ 00:06:19.983 10:32:55 -- common/autotest_common.sh@1142 -- # return 0 00:06:19.983 10:32:55 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:19.983 10:32:55 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.983 10:32:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.983 10:32:55 -- common/autotest_common.sh@10 -- # set +x 00:06:19.983 ************************************ 00:06:19.983 START TEST rpc 00:06:19.983 ************************************ 00:06:19.983 10:32:55 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:20.241 * Looking for test storage... 00:06:20.241 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:20.241 10:32:55 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1974839 00:06:20.241 10:32:55 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:20.241 10:32:55 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1974839 00:06:20.241 10:32:55 rpc -- common/autotest_common.sh@829 -- # '[' -z 1974839 ']' 00:06:20.241 10:32:55 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:20.241 10:32:55 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:20.241 10:32:55 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:20.241 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:20.241 10:32:55 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:20.241 10:32:55 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:20.241 10:32:55 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:20.241 [2024-07-12 10:32:55.294320] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:06:20.241 [2024-07-12 10:32:55.294387] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1974839 ] 00:06:20.241 [2024-07-12 10:32:55.421862] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.499 [2024-07-12 10:32:55.526452] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:20.499 [2024-07-12 10:32:55.526503] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1974839' to capture a snapshot of events at runtime. 00:06:20.499 [2024-07-12 10:32:55.526518] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:20.499 [2024-07-12 10:32:55.526530] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:20.499 [2024-07-12 10:32:55.526541] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1974839 for offline analysis/debug. 00:06:20.499 [2024-07-12 10:32:55.526572] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.436 10:32:56 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:21.436 10:32:56 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:21.436 10:32:56 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:21.436 10:32:56 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:21.436 10:32:56 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:21.436 10:32:56 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:21.436 10:32:56 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:21.436 10:32:56 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.436 10:32:56 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:21.436 ************************************ 00:06:21.436 START TEST rpc_integrity 00:06:21.436 ************************************ 00:06:21.436 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:21.436 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:21.436 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.436 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.436 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.436 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:21.436 10:32:56 rpc.rpc_integrity -- 
rpc/rpc.sh@13 -- # jq length 00:06:21.436 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:21.437 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:21.437 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.437 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.437 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.437 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:21.437 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:21.437 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.437 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.437 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.437 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:21.437 { 00:06:21.437 "name": "Malloc0", 00:06:21.437 "aliases": [ 00:06:21.437 "ce2ef1c0-41ec-4722-9108-0f9e12fa3455" 00:06:21.437 ], 00:06:21.437 "product_name": "Malloc disk", 00:06:21.437 "block_size": 512, 00:06:21.437 "num_blocks": 16384, 00:06:21.437 "uuid": "ce2ef1c0-41ec-4722-9108-0f9e12fa3455", 00:06:21.437 "assigned_rate_limits": { 00:06:21.437 "rw_ios_per_sec": 0, 00:06:21.437 "rw_mbytes_per_sec": 0, 00:06:21.437 "r_mbytes_per_sec": 0, 00:06:21.437 "w_mbytes_per_sec": 0 00:06:21.437 }, 00:06:21.437 "claimed": false, 00:06:21.437 "zoned": false, 00:06:21.437 "supported_io_types": { 00:06:21.437 "read": true, 00:06:21.437 "write": true, 00:06:21.437 "unmap": true, 00:06:21.437 "flush": true, 00:06:21.437 "reset": true, 00:06:21.437 "nvme_admin": false, 00:06:21.437 "nvme_io": false, 00:06:21.437 "nvme_io_md": false, 00:06:21.437 "write_zeroes": true, 00:06:21.437 "zcopy": true, 00:06:21.437 "get_zone_info": false, 00:06:21.437 "zone_management": false, 00:06:21.437 "zone_append": false, 00:06:21.437 "compare": false, 00:06:21.437 "compare_and_write": false, 00:06:21.437 "abort": true, 00:06:21.437 "seek_hole": false, 00:06:21.437 "seek_data": false, 00:06:21.437 "copy": true, 00:06:21.437 "nvme_iov_md": false 00:06:21.437 }, 00:06:21.437 "memory_domains": [ 00:06:21.437 { 00:06:21.437 "dma_device_id": "system", 00:06:21.437 "dma_device_type": 1 00:06:21.437 }, 00:06:21.437 { 00:06:21.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.437 "dma_device_type": 2 00:06:21.437 } 00:06:21.437 ], 00:06:21.437 "driver_specific": {} 00:06:21.437 } 00:06:21.437 ]' 00:06:21.437 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:21.729 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:21.729 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:21.729 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.729 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.729 [2024-07-12 10:32:56.644368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:21.729 [2024-07-12 10:32:56.644406] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:21.729 [2024-07-12 10:32:56.644426] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x220ceb0 00:06:21.729 [2024-07-12 10:32:56.644439] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:21.729 [2024-07-12 10:32:56.645947] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:21.729 [2024-07-12 10:32:56.645975] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:21.729 Passthru0 00:06:21.729 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.729 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:21.729 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.729 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.729 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.729 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:21.729 { 00:06:21.729 "name": "Malloc0", 00:06:21.729 "aliases": [ 00:06:21.729 "ce2ef1c0-41ec-4722-9108-0f9e12fa3455" 00:06:21.729 ], 00:06:21.729 "product_name": "Malloc disk", 00:06:21.729 "block_size": 512, 00:06:21.730 "num_blocks": 16384, 00:06:21.730 "uuid": "ce2ef1c0-41ec-4722-9108-0f9e12fa3455", 00:06:21.730 "assigned_rate_limits": { 00:06:21.730 "rw_ios_per_sec": 0, 00:06:21.730 "rw_mbytes_per_sec": 0, 00:06:21.730 "r_mbytes_per_sec": 0, 00:06:21.730 "w_mbytes_per_sec": 0 00:06:21.730 }, 00:06:21.730 "claimed": true, 00:06:21.730 "claim_type": "exclusive_write", 00:06:21.730 "zoned": false, 00:06:21.730 "supported_io_types": { 00:06:21.730 "read": true, 00:06:21.730 "write": true, 00:06:21.730 "unmap": true, 00:06:21.730 "flush": true, 00:06:21.730 "reset": true, 00:06:21.730 "nvme_admin": false, 00:06:21.730 "nvme_io": false, 00:06:21.730 "nvme_io_md": false, 00:06:21.730 "write_zeroes": true, 00:06:21.730 "zcopy": true, 00:06:21.730 "get_zone_info": false, 00:06:21.730 "zone_management": false, 00:06:21.730 "zone_append": false, 00:06:21.730 "compare": false, 00:06:21.730 "compare_and_write": false, 00:06:21.730 "abort": true, 00:06:21.730 "seek_hole": false, 00:06:21.730 "seek_data": false, 00:06:21.730 "copy": true, 00:06:21.730 "nvme_iov_md": false 00:06:21.730 }, 00:06:21.730 "memory_domains": [ 00:06:21.730 { 00:06:21.730 "dma_device_id": "system", 00:06:21.730 "dma_device_type": 1 00:06:21.730 }, 00:06:21.730 { 00:06:21.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.730 "dma_device_type": 2 00:06:21.730 } 00:06:21.730 ], 00:06:21.730 "driver_specific": {} 00:06:21.730 }, 00:06:21.730 { 00:06:21.730 "name": "Passthru0", 00:06:21.730 "aliases": [ 00:06:21.730 "c7a7dc44-4097-53ce-81d4-21924d3d613a" 00:06:21.730 ], 00:06:21.730 "product_name": "passthru", 00:06:21.730 "block_size": 512, 00:06:21.730 "num_blocks": 16384, 00:06:21.730 "uuid": "c7a7dc44-4097-53ce-81d4-21924d3d613a", 00:06:21.730 "assigned_rate_limits": { 00:06:21.730 "rw_ios_per_sec": 0, 00:06:21.730 "rw_mbytes_per_sec": 0, 00:06:21.730 "r_mbytes_per_sec": 0, 00:06:21.730 "w_mbytes_per_sec": 0 00:06:21.730 }, 00:06:21.730 "claimed": false, 00:06:21.730 "zoned": false, 00:06:21.730 "supported_io_types": { 00:06:21.730 "read": true, 00:06:21.730 "write": true, 00:06:21.730 "unmap": true, 00:06:21.730 "flush": true, 00:06:21.730 "reset": true, 00:06:21.730 "nvme_admin": false, 00:06:21.730 "nvme_io": false, 00:06:21.730 "nvme_io_md": false, 00:06:21.730 "write_zeroes": true, 00:06:21.730 "zcopy": true, 00:06:21.730 "get_zone_info": false, 00:06:21.730 "zone_management": false, 00:06:21.730 "zone_append": false, 00:06:21.730 "compare": false, 00:06:21.730 "compare_and_write": false, 00:06:21.730 "abort": true, 00:06:21.730 "seek_hole": false, 00:06:21.730 "seek_data": false, 00:06:21.730 
"copy": true, 00:06:21.730 "nvme_iov_md": false 00:06:21.730 }, 00:06:21.730 "memory_domains": [ 00:06:21.730 { 00:06:21.730 "dma_device_id": "system", 00:06:21.730 "dma_device_type": 1 00:06:21.730 }, 00:06:21.730 { 00:06:21.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.730 "dma_device_type": 2 00:06:21.730 } 00:06:21.730 ], 00:06:21.730 "driver_specific": { 00:06:21.730 "passthru": { 00:06:21.730 "name": "Passthru0", 00:06:21.730 "base_bdev_name": "Malloc0" 00:06:21.730 } 00:06:21.730 } 00:06:21.730 } 00:06:21.730 ]' 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:21.730 10:32:56 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:21.730 00:06:21.730 real 0m0.354s 00:06:21.730 user 0m0.238s 00:06:21.730 sys 0m0.049s 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:21.730 10:32:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.730 ************************************ 00:06:21.730 END TEST rpc_integrity 00:06:21.730 ************************************ 00:06:22.016 10:32:56 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:22.016 10:32:56 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:22.016 10:32:56 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:22.016 10:32:56 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.016 10:32:56 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 ************************************ 00:06:22.016 START TEST rpc_plugins 00:06:22.016 ************************************ 00:06:22.016 10:32:56 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:22.016 10:32:56 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:22.016 10:32:56 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.016 10:32:56 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 10:32:56 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.016 10:32:56 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:22.016 10:32:56 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:22.016 10:32:56 
rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.016 10:32:56 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 10:32:56 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.016 10:32:56 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:22.016 { 00:06:22.016 "name": "Malloc1", 00:06:22.016 "aliases": [ 00:06:22.016 "9a10a678-aa4c-44bf-8b57-a13fe0bd34d1" 00:06:22.016 ], 00:06:22.016 "product_name": "Malloc disk", 00:06:22.016 "block_size": 4096, 00:06:22.016 "num_blocks": 256, 00:06:22.016 "uuid": "9a10a678-aa4c-44bf-8b57-a13fe0bd34d1", 00:06:22.016 "assigned_rate_limits": { 00:06:22.016 "rw_ios_per_sec": 0, 00:06:22.016 "rw_mbytes_per_sec": 0, 00:06:22.016 "r_mbytes_per_sec": 0, 00:06:22.016 "w_mbytes_per_sec": 0 00:06:22.016 }, 00:06:22.016 "claimed": false, 00:06:22.016 "zoned": false, 00:06:22.016 "supported_io_types": { 00:06:22.016 "read": true, 00:06:22.016 "write": true, 00:06:22.016 "unmap": true, 00:06:22.016 "flush": true, 00:06:22.016 "reset": true, 00:06:22.016 "nvme_admin": false, 00:06:22.016 "nvme_io": false, 00:06:22.016 "nvme_io_md": false, 00:06:22.016 "write_zeroes": true, 00:06:22.016 "zcopy": true, 00:06:22.016 "get_zone_info": false, 00:06:22.016 "zone_management": false, 00:06:22.016 "zone_append": false, 00:06:22.016 "compare": false, 00:06:22.016 "compare_and_write": false, 00:06:22.016 "abort": true, 00:06:22.016 "seek_hole": false, 00:06:22.016 "seek_data": false, 00:06:22.016 "copy": true, 00:06:22.016 "nvme_iov_md": false 00:06:22.016 }, 00:06:22.016 "memory_domains": [ 00:06:22.016 { 00:06:22.016 "dma_device_id": "system", 00:06:22.016 "dma_device_type": 1 00:06:22.016 }, 00:06:22.016 { 00:06:22.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:22.016 "dma_device_type": 2 00:06:22.016 } 00:06:22.016 ], 00:06:22.016 "driver_specific": {} 00:06:22.016 } 00:06:22.016 ]' 00:06:22.016 10:32:56 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:22.016 10:32:57 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:22.016 10:32:57 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.016 10:32:57 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.016 10:32:57 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:22.016 10:32:57 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:22.016 10:32:57 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:22.016 00:06:22.016 real 0m0.135s 00:06:22.016 user 0m0.083s 00:06:22.016 sys 0m0.019s 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:22.016 10:32:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 ************************************ 00:06:22.016 END TEST rpc_plugins 00:06:22.016 ************************************ 00:06:22.016 10:32:57 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:22.016 10:32:57 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test 
rpc_trace_cmd_test 00:06:22.016 10:32:57 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:22.016 10:32:57 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.016 10:32:57 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 ************************************ 00:06:22.016 START TEST rpc_trace_cmd_test 00:06:22.016 ************************************ 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:22.016 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1974839", 00:06:22.016 "tpoint_group_mask": "0x8", 00:06:22.016 "iscsi_conn": { 00:06:22.016 "mask": "0x2", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "scsi": { 00:06:22.016 "mask": "0x4", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "bdev": { 00:06:22.016 "mask": "0x8", 00:06:22.016 "tpoint_mask": "0xffffffffffffffff" 00:06:22.016 }, 00:06:22.016 "nvmf_rdma": { 00:06:22.016 "mask": "0x10", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "nvmf_tcp": { 00:06:22.016 "mask": "0x20", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "ftl": { 00:06:22.016 "mask": "0x40", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "blobfs": { 00:06:22.016 "mask": "0x80", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "dsa": { 00:06:22.016 "mask": "0x200", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "thread": { 00:06:22.016 "mask": "0x400", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "nvme_pcie": { 00:06:22.016 "mask": "0x800", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "iaa": { 00:06:22.016 "mask": "0x1000", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "nvme_tcp": { 00:06:22.016 "mask": "0x2000", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "bdev_nvme": { 00:06:22.016 "mask": "0x4000", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 }, 00:06:22.016 "sock": { 00:06:22.016 "mask": "0x8000", 00:06:22.016 "tpoint_mask": "0x0" 00:06:22.016 } 00:06:22.016 }' 00:06:22.016 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:22.274 00:06:22.274 real 
0m0.311s 00:06:22.274 user 0m0.281s 00:06:22.274 sys 0m0.022s 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:22.274 10:32:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:22.274 ************************************ 00:06:22.274 END TEST rpc_trace_cmd_test 00:06:22.274 ************************************ 00:06:22.532 10:32:57 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:22.532 10:32:57 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:22.532 10:32:57 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:22.532 10:32:57 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:22.532 10:32:57 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:22.532 10:32:57 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.532 10:32:57 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.532 ************************************ 00:06:22.532 START TEST rpc_daemon_integrity 00:06:22.532 ************************************ 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:22.532 { 00:06:22.532 "name": "Malloc2", 00:06:22.532 "aliases": [ 00:06:22.532 "5a4abf9f-99af-40d8-a45f-c887fe9491d0" 00:06:22.532 ], 00:06:22.532 "product_name": "Malloc disk", 00:06:22.532 "block_size": 512, 00:06:22.532 "num_blocks": 16384, 00:06:22.532 "uuid": "5a4abf9f-99af-40d8-a45f-c887fe9491d0", 00:06:22.532 "assigned_rate_limits": { 00:06:22.532 "rw_ios_per_sec": 0, 00:06:22.532 "rw_mbytes_per_sec": 0, 00:06:22.532 "r_mbytes_per_sec": 0, 00:06:22.532 "w_mbytes_per_sec": 0 00:06:22.532 }, 00:06:22.532 "claimed": false, 00:06:22.532 "zoned": false, 00:06:22.532 "supported_io_types": { 00:06:22.532 "read": true, 00:06:22.532 "write": true, 00:06:22.532 "unmap": true, 00:06:22.532 "flush": true, 00:06:22.532 "reset": true, 00:06:22.532 "nvme_admin": false, 00:06:22.532 "nvme_io": false, 00:06:22.532 "nvme_io_md": 
false, 00:06:22.532 "write_zeroes": true, 00:06:22.532 "zcopy": true, 00:06:22.532 "get_zone_info": false, 00:06:22.532 "zone_management": false, 00:06:22.532 "zone_append": false, 00:06:22.532 "compare": false, 00:06:22.532 "compare_and_write": false, 00:06:22.532 "abort": true, 00:06:22.532 "seek_hole": false, 00:06:22.532 "seek_data": false, 00:06:22.532 "copy": true, 00:06:22.532 "nvme_iov_md": false 00:06:22.532 }, 00:06:22.532 "memory_domains": [ 00:06:22.532 { 00:06:22.532 "dma_device_id": "system", 00:06:22.532 "dma_device_type": 1 00:06:22.532 }, 00:06:22.532 { 00:06:22.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:22.532 "dma_device_type": 2 00:06:22.532 } 00:06:22.532 ], 00:06:22.532 "driver_specific": {} 00:06:22.532 } 00:06:22.532 ]' 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.532 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.532 [2024-07-12 10:32:57.687352] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:22.532 [2024-07-12 10:32:57.687393] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:22.533 [2024-07-12 10:32:57.687416] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x220db20 00:06:22.533 [2024-07-12 10:32:57.687429] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:22.533 [2024-07-12 10:32:57.688844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:22.533 [2024-07-12 10:32:57.688872] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:22.533 Passthru0 00:06:22.533 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.533 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:22.533 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.533 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.789 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.789 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:22.789 { 00:06:22.789 "name": "Malloc2", 00:06:22.789 "aliases": [ 00:06:22.789 "5a4abf9f-99af-40d8-a45f-c887fe9491d0" 00:06:22.789 ], 00:06:22.789 "product_name": "Malloc disk", 00:06:22.789 "block_size": 512, 00:06:22.789 "num_blocks": 16384, 00:06:22.789 "uuid": "5a4abf9f-99af-40d8-a45f-c887fe9491d0", 00:06:22.789 "assigned_rate_limits": { 00:06:22.789 "rw_ios_per_sec": 0, 00:06:22.789 "rw_mbytes_per_sec": 0, 00:06:22.789 "r_mbytes_per_sec": 0, 00:06:22.789 "w_mbytes_per_sec": 0 00:06:22.789 }, 00:06:22.789 "claimed": true, 00:06:22.789 "claim_type": "exclusive_write", 00:06:22.789 "zoned": false, 00:06:22.789 "supported_io_types": { 00:06:22.789 "read": true, 00:06:22.789 "write": true, 00:06:22.789 "unmap": true, 00:06:22.789 "flush": true, 00:06:22.789 "reset": true, 00:06:22.789 "nvme_admin": false, 00:06:22.790 "nvme_io": false, 00:06:22.790 "nvme_io_md": false, 00:06:22.790 "write_zeroes": true, 00:06:22.790 "zcopy": true, 00:06:22.790 "get_zone_info": false, 00:06:22.790 
"zone_management": false, 00:06:22.790 "zone_append": false, 00:06:22.790 "compare": false, 00:06:22.790 "compare_and_write": false, 00:06:22.790 "abort": true, 00:06:22.790 "seek_hole": false, 00:06:22.790 "seek_data": false, 00:06:22.790 "copy": true, 00:06:22.790 "nvme_iov_md": false 00:06:22.790 }, 00:06:22.790 "memory_domains": [ 00:06:22.790 { 00:06:22.790 "dma_device_id": "system", 00:06:22.790 "dma_device_type": 1 00:06:22.790 }, 00:06:22.790 { 00:06:22.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:22.790 "dma_device_type": 2 00:06:22.790 } 00:06:22.790 ], 00:06:22.790 "driver_specific": {} 00:06:22.790 }, 00:06:22.790 { 00:06:22.790 "name": "Passthru0", 00:06:22.790 "aliases": [ 00:06:22.790 "68b6ebbc-e2c5-50b1-a9cc-c7b98a0c884e" 00:06:22.790 ], 00:06:22.790 "product_name": "passthru", 00:06:22.790 "block_size": 512, 00:06:22.790 "num_blocks": 16384, 00:06:22.790 "uuid": "68b6ebbc-e2c5-50b1-a9cc-c7b98a0c884e", 00:06:22.790 "assigned_rate_limits": { 00:06:22.790 "rw_ios_per_sec": 0, 00:06:22.790 "rw_mbytes_per_sec": 0, 00:06:22.790 "r_mbytes_per_sec": 0, 00:06:22.790 "w_mbytes_per_sec": 0 00:06:22.790 }, 00:06:22.790 "claimed": false, 00:06:22.790 "zoned": false, 00:06:22.790 "supported_io_types": { 00:06:22.790 "read": true, 00:06:22.790 "write": true, 00:06:22.790 "unmap": true, 00:06:22.790 "flush": true, 00:06:22.790 "reset": true, 00:06:22.790 "nvme_admin": false, 00:06:22.790 "nvme_io": false, 00:06:22.790 "nvme_io_md": false, 00:06:22.790 "write_zeroes": true, 00:06:22.790 "zcopy": true, 00:06:22.790 "get_zone_info": false, 00:06:22.790 "zone_management": false, 00:06:22.790 "zone_append": false, 00:06:22.790 "compare": false, 00:06:22.790 "compare_and_write": false, 00:06:22.790 "abort": true, 00:06:22.790 "seek_hole": false, 00:06:22.790 "seek_data": false, 00:06:22.790 "copy": true, 00:06:22.790 "nvme_iov_md": false 00:06:22.790 }, 00:06:22.790 "memory_domains": [ 00:06:22.790 { 00:06:22.790 "dma_device_id": "system", 00:06:22.790 "dma_device_type": 1 00:06:22.790 }, 00:06:22.790 { 00:06:22.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:22.790 "dma_device_type": 2 00:06:22.790 } 00:06:22.790 ], 00:06:22.790 "driver_specific": { 00:06:22.790 "passthru": { 00:06:22.790 "name": "Passthru0", 00:06:22.790 "base_bdev_name": "Malloc2" 00:06:22.790 } 00:06:22.790 } 00:06:22.790 } 00:06:22.790 ]' 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@10 -- # set +x 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:22.790 00:06:22.790 real 0m0.301s 00:06:22.790 user 0m0.184s 00:06:22.790 sys 0m0.054s 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:22.790 10:32:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:22.790 ************************************ 00:06:22.790 END TEST rpc_daemon_integrity 00:06:22.790 ************************************ 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:22.790 10:32:57 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:22.790 10:32:57 rpc -- rpc/rpc.sh@84 -- # killprocess 1974839 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@948 -- # '[' -z 1974839 ']' 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@952 -- # kill -0 1974839 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@953 -- # uname 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1974839 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1974839' 00:06:22.790 killing process with pid 1974839 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@967 -- # kill 1974839 00:06:22.790 10:32:57 rpc -- common/autotest_common.sh@972 -- # wait 1974839 00:06:23.355 00:06:23.355 real 0m3.190s 00:06:23.355 user 0m4.276s 00:06:23.355 sys 0m0.935s 00:06:23.355 10:32:58 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:23.355 10:32:58 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:23.355 ************************************ 00:06:23.355 END TEST rpc 00:06:23.355 ************************************ 00:06:23.355 10:32:58 -- common/autotest_common.sh@1142 -- # return 0 00:06:23.355 10:32:58 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:23.355 10:32:58 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:23.355 10:32:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.355 10:32:58 -- common/autotest_common.sh@10 -- # set +x 00:06:23.355 ************************************ 00:06:23.355 START TEST skip_rpc 00:06:23.355 ************************************ 00:06:23.355 10:32:58 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:23.355 * Looking for test storage... 
00:06:23.355 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:23.355 10:32:58 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:23.355 10:32:58 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:23.355 10:32:58 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:23.355 10:32:58 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:23.355 10:32:58 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.355 10:32:58 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:23.355 ************************************ 00:06:23.355 START TEST skip_rpc 00:06:23.355 ************************************ 00:06:23.355 10:32:58 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:23.355 10:32:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1975376 00:06:23.355 10:32:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:23.355 10:32:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:23.355 10:32:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:23.612 [2024-07-12 10:32:58.590136] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:06:23.613 [2024-07-12 10:32:58.590182] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1975376 ] 00:06:23.613 [2024-07-12 10:32:58.700035] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.613 [2024-07-12 10:32:58.800748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:28.866 10:33:03 
skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1975376 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 1975376 ']' 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 1975376 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1975376 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1975376' 00:06:28.866 killing process with pid 1975376 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 1975376 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 1975376 00:06:28.866 00:06:28.866 real 0m5.457s 00:06:28.866 user 0m5.097s 00:06:28.866 sys 0m0.367s 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:28.866 10:33:03 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:28.866 ************************************ 00:06:28.866 END TEST skip_rpc 00:06:28.866 ************************************ 00:06:28.866 10:33:04 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:28.866 10:33:04 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:28.866 10:33:04 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:28.866 10:33:04 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:28.866 10:33:04 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:29.124 ************************************ 00:06:29.124 START TEST skip_rpc_with_json 00:06:29.124 ************************************ 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1976239 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1976239 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 1976239 ']' 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:29.124 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
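The skip_rpc run above reduces to one assertion: with --no-rpc-server the target never opens /var/tmp/spdk.sock, so any RPC issued against it must fail. A condensed sketch under the same assumptions (build path as in the log, rpc.py on its default socket):

  SPDK_BIN=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  $SPDK_BIN/spdk_tgt --no-rpc-server -m 0x1 &
  spdk_pid=$!
  sleep 5                                  # same settle time the test uses
  if $RPC spdk_get_version; then           # expected to fail: nothing listens on the RPC socket
          echo "spdk_get_version unexpectedly succeeded" >&2
          kill -9 $spdk_pid
          exit 1
  fi
  kill -9 $spdk_pid
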
00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:29.124 10:33:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:29.124 [2024-07-12 10:33:04.145389] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:06:29.124 [2024-07-12 10:33:04.145459] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1976239 ] 00:06:29.124 [2024-07-12 10:33:04.276944] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.381 [2024-07-12 10:33:04.388799] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:29.947 [2024-07-12 10:33:05.058490] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:29.947 request: 00:06:29.947 { 00:06:29.947 "trtype": "tcp", 00:06:29.947 "method": "nvmf_get_transports", 00:06:29.947 "req_id": 1 00:06:29.947 } 00:06:29.947 Got JSON-RPC error response 00:06:29.947 response: 00:06:29.947 { 00:06:29.947 "code": -19, 00:06:29.947 "message": "No such device" 00:06:29.947 } 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:29.947 [2024-07-12 10:33:05.070632] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:29.947 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:30.206 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:30.206 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:30.206 { 00:06:30.206 "subsystems": [ 00:06:30.206 { 00:06:30.206 "subsystem": "keyring", 00:06:30.206 "config": [] 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "subsystem": "iobuf", 00:06:30.206 "config": [ 00:06:30.206 { 00:06:30.206 "method": "iobuf_set_options", 00:06:30.206 "params": { 00:06:30.206 "small_pool_count": 8192, 00:06:30.206 "large_pool_count": 1024, 00:06:30.206 "small_bufsize": 8192, 00:06:30.206 "large_bufsize": 135168 00:06:30.206 } 00:06:30.206 } 00:06:30.206 ] 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "subsystem": "sock", 00:06:30.206 "config": [ 00:06:30.206 { 00:06:30.206 
"method": "sock_set_default_impl", 00:06:30.206 "params": { 00:06:30.206 "impl_name": "posix" 00:06:30.206 } 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "method": "sock_impl_set_options", 00:06:30.206 "params": { 00:06:30.206 "impl_name": "ssl", 00:06:30.206 "recv_buf_size": 4096, 00:06:30.206 "send_buf_size": 4096, 00:06:30.206 "enable_recv_pipe": true, 00:06:30.206 "enable_quickack": false, 00:06:30.206 "enable_placement_id": 0, 00:06:30.206 "enable_zerocopy_send_server": true, 00:06:30.206 "enable_zerocopy_send_client": false, 00:06:30.206 "zerocopy_threshold": 0, 00:06:30.206 "tls_version": 0, 00:06:30.206 "enable_ktls": false 00:06:30.206 } 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "method": "sock_impl_set_options", 00:06:30.206 "params": { 00:06:30.206 "impl_name": "posix", 00:06:30.206 "recv_buf_size": 2097152, 00:06:30.206 "send_buf_size": 2097152, 00:06:30.206 "enable_recv_pipe": true, 00:06:30.206 "enable_quickack": false, 00:06:30.206 "enable_placement_id": 0, 00:06:30.206 "enable_zerocopy_send_server": true, 00:06:30.206 "enable_zerocopy_send_client": false, 00:06:30.206 "zerocopy_threshold": 0, 00:06:30.206 "tls_version": 0, 00:06:30.206 "enable_ktls": false 00:06:30.206 } 00:06:30.206 } 00:06:30.206 ] 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "subsystem": "vmd", 00:06:30.206 "config": [] 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "subsystem": "accel", 00:06:30.206 "config": [ 00:06:30.206 { 00:06:30.206 "method": "accel_set_options", 00:06:30.206 "params": { 00:06:30.206 "small_cache_size": 128, 00:06:30.206 "large_cache_size": 16, 00:06:30.206 "task_count": 2048, 00:06:30.206 "sequence_count": 2048, 00:06:30.206 "buf_count": 2048 00:06:30.206 } 00:06:30.206 } 00:06:30.206 ] 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "subsystem": "bdev", 00:06:30.206 "config": [ 00:06:30.206 { 00:06:30.206 "method": "bdev_set_options", 00:06:30.206 "params": { 00:06:30.206 "bdev_io_pool_size": 65535, 00:06:30.206 "bdev_io_cache_size": 256, 00:06:30.206 "bdev_auto_examine": true, 00:06:30.206 "iobuf_small_cache_size": 128, 00:06:30.206 "iobuf_large_cache_size": 16 00:06:30.206 } 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "method": "bdev_raid_set_options", 00:06:30.206 "params": { 00:06:30.206 "process_window_size_kb": 1024 00:06:30.206 } 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "method": "bdev_iscsi_set_options", 00:06:30.206 "params": { 00:06:30.206 "timeout_sec": 30 00:06:30.206 } 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "method": "bdev_nvme_set_options", 00:06:30.206 "params": { 00:06:30.206 "action_on_timeout": "none", 00:06:30.206 "timeout_us": 0, 00:06:30.206 "timeout_admin_us": 0, 00:06:30.206 "keep_alive_timeout_ms": 10000, 00:06:30.206 "arbitration_burst": 0, 00:06:30.206 "low_priority_weight": 0, 00:06:30.206 "medium_priority_weight": 0, 00:06:30.206 "high_priority_weight": 0, 00:06:30.206 "nvme_adminq_poll_period_us": 10000, 00:06:30.206 "nvme_ioq_poll_period_us": 0, 00:06:30.206 "io_queue_requests": 0, 00:06:30.206 "delay_cmd_submit": true, 00:06:30.206 "transport_retry_count": 4, 00:06:30.206 "bdev_retry_count": 3, 00:06:30.206 "transport_ack_timeout": 0, 00:06:30.206 "ctrlr_loss_timeout_sec": 0, 00:06:30.206 "reconnect_delay_sec": 0, 00:06:30.206 "fast_io_fail_timeout_sec": 0, 00:06:30.206 "disable_auto_failback": false, 00:06:30.206 "generate_uuids": false, 00:06:30.206 "transport_tos": 0, 00:06:30.206 "nvme_error_stat": false, 00:06:30.206 "rdma_srq_size": 0, 00:06:30.206 "io_path_stat": false, 00:06:30.206 "allow_accel_sequence": false, 00:06:30.206 
"rdma_max_cq_size": 0, 00:06:30.206 "rdma_cm_event_timeout_ms": 0, 00:06:30.206 "dhchap_digests": [ 00:06:30.206 "sha256", 00:06:30.206 "sha384", 00:06:30.206 "sha512" 00:06:30.206 ], 00:06:30.206 "dhchap_dhgroups": [ 00:06:30.206 "null", 00:06:30.206 "ffdhe2048", 00:06:30.206 "ffdhe3072", 00:06:30.206 "ffdhe4096", 00:06:30.206 "ffdhe6144", 00:06:30.206 "ffdhe8192" 00:06:30.206 ] 00:06:30.206 } 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "method": "bdev_nvme_set_hotplug", 00:06:30.206 "params": { 00:06:30.206 "period_us": 100000, 00:06:30.206 "enable": false 00:06:30.206 } 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "method": "bdev_wait_for_examine" 00:06:30.206 } 00:06:30.206 ] 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "subsystem": "scsi", 00:06:30.206 "config": null 00:06:30.206 }, 00:06:30.206 { 00:06:30.206 "subsystem": "scheduler", 00:06:30.206 "config": [ 00:06:30.206 { 00:06:30.206 "method": "framework_set_scheduler", 00:06:30.206 "params": { 00:06:30.206 "name": "static" 00:06:30.206 } 00:06:30.207 } 00:06:30.207 ] 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "subsystem": "vhost_scsi", 00:06:30.207 "config": [] 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "subsystem": "vhost_blk", 00:06:30.207 "config": [] 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "subsystem": "ublk", 00:06:30.207 "config": [] 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "subsystem": "nbd", 00:06:30.207 "config": [] 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "subsystem": "nvmf", 00:06:30.207 "config": [ 00:06:30.207 { 00:06:30.207 "method": "nvmf_set_config", 00:06:30.207 "params": { 00:06:30.207 "discovery_filter": "match_any", 00:06:30.207 "admin_cmd_passthru": { 00:06:30.207 "identify_ctrlr": false 00:06:30.207 } 00:06:30.207 } 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "method": "nvmf_set_max_subsystems", 00:06:30.207 "params": { 00:06:30.207 "max_subsystems": 1024 00:06:30.207 } 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "method": "nvmf_set_crdt", 00:06:30.207 "params": { 00:06:30.207 "crdt1": 0, 00:06:30.207 "crdt2": 0, 00:06:30.207 "crdt3": 0 00:06:30.207 } 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "method": "nvmf_create_transport", 00:06:30.207 "params": { 00:06:30.207 "trtype": "TCP", 00:06:30.207 "max_queue_depth": 128, 00:06:30.207 "max_io_qpairs_per_ctrlr": 127, 00:06:30.207 "in_capsule_data_size": 4096, 00:06:30.207 "max_io_size": 131072, 00:06:30.207 "io_unit_size": 131072, 00:06:30.207 "max_aq_depth": 128, 00:06:30.207 "num_shared_buffers": 511, 00:06:30.207 "buf_cache_size": 4294967295, 00:06:30.207 "dif_insert_or_strip": false, 00:06:30.207 "zcopy": false, 00:06:30.207 "c2h_success": true, 00:06:30.207 "sock_priority": 0, 00:06:30.207 "abort_timeout_sec": 1, 00:06:30.207 "ack_timeout": 0, 00:06:30.207 "data_wr_pool_size": 0 00:06:30.207 } 00:06:30.207 } 00:06:30.207 ] 00:06:30.207 }, 00:06:30.207 { 00:06:30.207 "subsystem": "iscsi", 00:06:30.207 "config": [ 00:06:30.207 { 00:06:30.207 "method": "iscsi_set_options", 00:06:30.207 "params": { 00:06:30.207 "node_base": "iqn.2016-06.io.spdk", 00:06:30.207 "max_sessions": 128, 00:06:30.207 "max_connections_per_session": 2, 00:06:30.207 "max_queue_depth": 64, 00:06:30.207 "default_time2wait": 2, 00:06:30.207 "default_time2retain": 20, 00:06:30.207 "first_burst_length": 8192, 00:06:30.207 "immediate_data": true, 00:06:30.207 "allow_duplicated_isid": false, 00:06:30.207 "error_recovery_level": 0, 00:06:30.207 "nop_timeout": 60, 00:06:30.207 "nop_in_interval": 30, 00:06:30.207 "disable_chap": false, 00:06:30.207 "require_chap": false, 00:06:30.207 
"mutual_chap": false, 00:06:30.207 "chap_group": 0, 00:06:30.207 "max_large_datain_per_connection": 64, 00:06:30.207 "max_r2t_per_connection": 4, 00:06:30.207 "pdu_pool_size": 36864, 00:06:30.207 "immediate_data_pool_size": 16384, 00:06:30.207 "data_out_pool_size": 2048 00:06:30.207 } 00:06:30.207 } 00:06:30.207 ] 00:06:30.207 } 00:06:30.207 ] 00:06:30.207 } 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1976239 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1976239 ']' 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1976239 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1976239 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1976239' 00:06:30.207 killing process with pid 1976239 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1976239 00:06:30.207 10:33:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1976239 00:06:30.464 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1976814 00:06:30.464 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:30.464 10:33:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1976814 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1976814 ']' 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1976814 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1976814 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1976814' 00:06:35.723 killing process with pid 1976814 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1976814 00:06:35.723 10:33:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1976814 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:35.981 10:33:11 
skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:35.981 00:06:35.981 real 0m7.018s 00:06:35.981 user 0m6.670s 00:06:35.981 sys 0m0.862s 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:35.981 ************************************ 00:06:35.981 END TEST skip_rpc_with_json 00:06:35.981 ************************************ 00:06:35.981 10:33:11 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:35.981 10:33:11 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:35.981 10:33:11 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:35.981 10:33:11 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.981 10:33:11 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.981 ************************************ 00:06:35.981 START TEST skip_rpc_with_delay 00:06:35.981 ************************************ 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:35.981 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:36.238 [2024-07-12 10:33:11.249177] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
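The JSON dump a few entries above is the save_config output that skip_rpc_with_json replays: the test creates the TCP transport over RPC, saves the live configuration to config.json, restarts the target with --json and no RPC server, then greps the new target's log for 'TCP Transport Init' to prove the transport came back from JSON alone. A sketch of that round trip (paths as in the log; process teardown and error handling omitted, and the log redirection is an assumption of this sketch):

  SPDK_BIN=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  CONFIG=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json
  LOG=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt
  $RPC nvmf_create_transport -t tcp        # against the running target started earlier
  $RPC save_config > "$CONFIG"             # dump every configured subsystem as JSON
  # replay the configuration in a fresh target with the RPC server disabled
  $SPDK_BIN/spdk_tgt --no-rpc-server -m 0x1 --json "$CONFIG" > "$LOG" 2>&1 &
  sleep 5
  grep -q 'TCP Transport Init' "$LOG"      # transport was recreated purely from the JSON file
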
00:06:36.238 [2024-07-12 10:33:11.249269] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:36.238 00:06:36.238 real 0m0.095s 00:06:36.238 user 0m0.060s 00:06:36.238 sys 0m0.034s 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:36.238 10:33:11 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:36.238 ************************************ 00:06:36.238 END TEST skip_rpc_with_delay 00:06:36.238 ************************************ 00:06:36.238 10:33:11 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:36.238 10:33:11 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:36.238 10:33:11 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:36.238 10:33:11 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:36.238 10:33:11 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:36.238 10:33:11 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:36.238 10:33:11 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.238 ************************************ 00:06:36.238 START TEST exit_on_failed_rpc_init 00:06:36.238 ************************************ 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1977714 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1977714 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 1977714 ']' 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:36.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:36.239 10:33:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:36.239 [2024-07-12 10:33:11.429640] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
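skip_rpc_with_delay above is a pure negative test: --wait-for-rpc only makes sense when an RPC server will be started, so combining it with --no-rpc-server must fail fast with the "Cannot use '--wait-for-rpc'" error and a non-zero exit code. The essence, under the same paths:

  SPDK_BIN=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
  # must refuse to start: waiting for RPC without an RPC server is contradictory
  if $SPDK_BIN/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
          echo "spdk_tgt unexpectedly accepted --no-rpc-server together with --wait-for-rpc" >&2
          exit 1
  fi
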
00:06:36.239 [2024-07-12 10:33:11.429710] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1977714 ] 00:06:36.496 [2024-07-12 10:33:11.561570] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.496 [2024-07-12 10:33:11.658294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:37.427 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:37.427 [2024-07-12 10:33:12.431556] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:06:37.427 [2024-07-12 10:33:12.431625] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1977800 ] 00:06:37.427 [2024-07-12 10:33:12.550829] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.685 [2024-07-12 10:33:12.656924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:37.685 [2024-07-12 10:33:12.657004] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
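exit_on_failed_rpc_init above starts a second target on a different core mask but the same default RPC socket; the second instance has to notice that /var/tmp/spdk.sock is already in use, emit the rpc.c errors shown here, and exit non-zero instead of hanging. Roughly, assuming the same build path (the real test waits on the RPC socket rather than sleeping):

  SPDK_BIN=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
  $SPDK_BIN/spdk_tgt -m 0x1 &                # first target owns /var/tmp/spdk.sock
  first_pid=$!
  sleep 5                                    # crude settle time; an assumption of this sketch
  if $SPDK_BIN/spdk_tgt -m 0x2; then         # second target must fail: RPC socket in use
          echo "second spdk_tgt unexpectedly initialized" >&2
          kill -9 $first_pid
          exit 1
  fi
  kill -9 $first_pid
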
00:06:37.685 [2024-07-12 10:33:12.657021] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:37.685 [2024-07-12 10:33:12.657032] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1977714 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 1977714 ']' 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 1977714 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1977714 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1977714' 00:06:37.685 killing process with pid 1977714 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 1977714 00:06:37.685 10:33:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 1977714 00:06:38.251 00:06:38.251 real 0m1.847s 00:06:38.251 user 0m2.155s 00:06:38.251 sys 0m0.608s 00:06:38.251 10:33:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:38.251 10:33:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:38.251 ************************************ 00:06:38.251 END TEST exit_on_failed_rpc_init 00:06:38.251 ************************************ 00:06:38.251 10:33:13 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:38.251 10:33:13 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:38.251 00:06:38.251 real 0m14.859s 00:06:38.251 user 0m14.139s 00:06:38.251 sys 0m2.191s 00:06:38.251 10:33:13 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:38.251 10:33:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:38.251 ************************************ 00:06:38.251 END TEST skip_rpc 00:06:38.251 ************************************ 00:06:38.251 10:33:13 -- common/autotest_common.sh@1142 -- # return 0 00:06:38.251 10:33:13 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:38.251 10:33:13 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:38.251 10:33:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.251 10:33:13 -- common/autotest_common.sh@10 -- # set +x 00:06:38.251 ************************************ 00:06:38.251 START TEST rpc_client 00:06:38.251 ************************************ 00:06:38.251 10:33:13 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:38.251 * Looking for test storage... 00:06:38.251 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:38.251 10:33:13 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:38.509 OK 00:06:38.509 10:33:13 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:38.509 00:06:38.509 real 0m0.139s 00:06:38.509 user 0m0.052s 00:06:38.509 sys 0m0.096s 00:06:38.509 10:33:13 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:38.509 10:33:13 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:38.509 ************************************ 00:06:38.509 END TEST rpc_client 00:06:38.509 ************************************ 00:06:38.509 10:33:13 -- common/autotest_common.sh@1142 -- # return 0 00:06:38.509 10:33:13 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:38.509 10:33:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:38.509 10:33:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.509 10:33:13 -- common/autotest_common.sh@10 -- # set +x 00:06:38.509 ************************************ 00:06:38.509 START TEST json_config 00:06:38.509 ************************************ 00:06:38.509 10:33:13 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:38.509 10:33:13 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:38.509 10:33:13 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:38.509 10:33:13 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:38.509 10:33:13 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:38.509 10:33:13 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:38.509 10:33:13 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:38.509 10:33:13 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:38.509 10:33:13 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:38.509 10:33:13 json_config -- paths/export.sh@5 -- # export PATH 00:06:38.509 10:33:13 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@47 -- # : 0 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:38.509 10:33:13 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:38.509 10:33:13 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:38.509 10:33:13 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:38.509 10:33:13 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:38.509 
10:33:13 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:38.510 INFO: JSON configuration test init 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.510 10:33:13 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:38.510 10:33:13 json_config -- json_config/common.sh@9 -- # local app=target 00:06:38.510 10:33:13 json_config -- json_config/common.sh@10 -- # shift 00:06:38.510 10:33:13 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:38.510 10:33:13 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:38.510 10:33:13 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:38.510 10:33:13 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:38.510 10:33:13 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:38.510 10:33:13 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1978074 00:06:38.510 10:33:13 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:38.510 Waiting for target to run... 
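The json_config suite starts this target with -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc and, as the following entries show, first enables the DPDK cryptodev accel module, assigns the encrypt/decrypt opcodes to it, and then pipes a generated NVMe subsystem configuration into load_config. A condensed sketch of those first RPC steps (socket and script paths as in the log; the tgt_rpc wrapper shown in the trace simply adds the -s option):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
  GEN_NVME=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh
  $RPC dpdk_cryptodev_scan_accel_module                  # enable the DPDK cryptodev accel module
  $RPC accel_assign_opc -o encrypt -m dpdk_cryptodev     # route encrypt ops to dpdk_cryptodev
  $RPC accel_assign_opc -o decrypt -m dpdk_cryptodev     # route decrypt ops to dpdk_cryptodev
  $GEN_NVME --json-with-subsystems | $RPC load_config    # feed a generated NVMe subsystem config into the target
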
00:06:38.510 10:33:13 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:38.510 10:33:13 json_config -- json_config/common.sh@25 -- # waitforlisten 1978074 /var/tmp/spdk_tgt.sock 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@829 -- # '[' -z 1978074 ']' 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:38.510 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:38.510 10:33:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.768 [2024-07-12 10:33:13.738845] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:06:38.768 [2024-07-12 10:33:13.738918] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1978074 ] 00:06:39.026 [2024-07-12 10:33:14.086892] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.026 [2024-07-12 10:33:14.177466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.591 10:33:14 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:39.591 10:33:14 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:39.591 10:33:14 json_config -- json_config/common.sh@26 -- # echo '' 00:06:39.591 00:06:39.591 10:33:14 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:39.591 10:33:14 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:39.591 10:33:14 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:39.591 10:33:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:39.591 10:33:14 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:39.591 10:33:14 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:39.591 10:33:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:39.849 10:33:14 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:39.849 10:33:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:40.107 [2024-07-12 10:33:15.140342] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:40.107 10:33:15 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:40.107 10:33:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:40.365 [2024-07-12 10:33:15.384975] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:40.365 10:33:15 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:40.365 10:33:15 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:40.365 10:33:15 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:40.365 10:33:15 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:40.365 10:33:15 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:40.365 10:33:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:40.628 [2024-07-12 10:33:15.698431] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:43.192 10:33:18 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:43.192 10:33:18 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:43.192 10:33:18 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:43.192 10:33:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:43.192 10:33:18 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:43.192 10:33:18 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:43.192 10:33:18 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:06:43.192 10:33:18 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:43.192 10:33:18 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:43.192 10:33:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:43.461 10:33:18 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:43.461 10:33:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:43.461 10:33:18 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:43.461 10:33:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:43.461 10:33:18 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:43.461 10:33:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:43.718 10:33:18 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:43.718 10:33:18 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:43.718 10:33:18 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:43.718 10:33:18 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:43.718 10:33:18 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:43.718 10:33:18 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:43.718 10:33:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:43.974 Nvme0n1p0 Nvme0n1p1 00:06:43.974 10:33:18 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:43.974 10:33:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:44.231 [2024-07-12 10:33:19.178686] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:44.231 [2024-07-12 10:33:19.178740] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:44.231 00:06:44.231 10:33:19 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:44.231 10:33:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:44.489 Malloc3 00:06:44.489 10:33:19 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:44.489 10:33:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:44.489 [2024-07-12 10:33:19.672080] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:44.489 [2024-07-12 10:33:19.672126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:44.490 [2024-07-12 10:33:19.672152] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc7fa00 00:06:44.490 [2024-07-12 10:33:19.672165] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:44.490 [2024-07-12 10:33:19.673790] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:44.490 [2024-07-12 10:33:19.673817] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:44.490 PTBdevFromMalloc3 00:06:44.747 10:33:19 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:44.747 10:33:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:44.747 Null0 00:06:44.747 10:33:19 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:44.747 10:33:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:45.004 Malloc0 00:06:45.004 10:33:20 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:45.004 10:33:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:45.261 Malloc1 00:06:45.261 10:33:20 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:45.261 10:33:20 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:45.828 102400+0 records in 00:06:45.828 102400+0 records out 00:06:45.828 104857600 bytes (105 MB, 100 MiB) copied, 0.305316 s, 343 MB/s 00:06:45.828 10:33:20 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:45.828 10:33:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:45.828 aio_disk 00:06:45.828 10:33:20 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:45.828 10:33:20 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:45.828 10:33:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:51.088 7bd3dc6b-328c-44c8-b3d5-e66001f8f54b 00:06:51.088 10:33:25 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:51.088 10:33:25 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:51.088 10:33:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:51.088 10:33:25 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:51.088 10:33:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:51.088 10:33:26 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:51.088 10:33:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:51.346 10:33:26 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:51.346 10:33:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:51.604 10:33:26 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:51.604 10:33:26 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:51.604 10:33:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:51.862 MallocForCryptoBdev 00:06:51.862 10:33:26 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:51.862 10:33:26 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:51.862 10:33:26 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:51.862 10:33:26 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:51.862 10:33:26 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:51.862 10:33:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:52.120 [2024-07-12 10:33:27.167429] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:52.120 CryptoMallocBdev 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:01a395b6-7687-40e6-8882-2b7bff9689d7 bdev_register:497a62a0-be09-401e-9279-edcb158cef89 bdev_register:e8d95e44-0ea7-4fcd-b65d-1e9d81c1e5f5 bdev_register:97c12a51-f20c-4971-b21c-b7ef297b4ccd bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:01a395b6-7687-40e6-8882-2b7bff9689d7 bdev_register:497a62a0-be09-401e-9279-edcb158cef89 bdev_register:e8d95e44-0ea7-4fcd-b65d-1e9d81c1e5f5 bdev_register:97c12a51-f20c-4971-b21c-b7ef297b4ccd bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@71 -- # sort 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@72 -- # sort 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:52.120 10:33:27 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:52.120 10:33:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:01a395b6-7687-40e6-8882-2b7bff9689d7 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:497a62a0-be09-401e-9279-edcb158cef89 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:e8d95e44-0ea7-4fcd-b65d-1e9d81c1e5f5 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:97c12a51-f20c-4971-b21c-b7ef297b4ccd 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:01a395b6-7687-40e6-8882-2b7bff9689d7 bdev_register:497a62a0-be09-401e-9279-edcb158cef89 bdev_register:97c12a51-f20c-4971-b21c-b7ef297b4ccd bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:e8d95e44-0ea7-4fcd-b65d-1e9d81c1e5f5 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\1\a\3\9\5\b\6\-\7\6\8\7\-\4\0\e\6\-\8\8\8\2\-\2\b\7\b\f\f\9\6\8\9\d\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\9\7\a\6\2\a\0\-\b\e\0\9\-\4\0\1\e\-\9\2\7\9\-\e\d\c\b\1\5\8\c\e\f\8\9\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\7\c\1\2\a\5\1\-\f\2\0\c\-\4\9\7\1\-\b\2\1\c\-\b\7\e\f\2\9\7\b\4\c\c\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\e\8\d\9\5\e\4\4\-\0\e\a\7\-\4\f\c\d\-\b\6\5\d\-\1\e\9\d\8\1\c\1\e\5\f\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@86 -- # cat 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:01a395b6-7687-40e6-8882-2b7bff9689d7 bdev_register:497a62a0-be09-401e-9279-edcb158cef89 bdev_register:97c12a51-f20c-4971-b21c-b7ef297b4ccd bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:e8d95e44-0ea7-4fcd-b65d-1e9d81c1e5f5 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:52.379 Expected events matched: 00:06:52.379 bdev_register:01a395b6-7687-40e6-8882-2b7bff9689d7 00:06:52.379 bdev_register:497a62a0-be09-401e-9279-edcb158cef89 00:06:52.379 bdev_register:97c12a51-f20c-4971-b21c-b7ef297b4ccd 00:06:52.379 bdev_register:aio_disk 00:06:52.379 bdev_register:CryptoMallocBdev 00:06:52.379 bdev_register:e8d95e44-0ea7-4fcd-b65d-1e9d81c1e5f5 00:06:52.379 bdev_register:Malloc0 00:06:52.379 bdev_register:Malloc0p0 00:06:52.379 bdev_register:Malloc0p1 00:06:52.379 bdev_register:Malloc0p2 00:06:52.379 bdev_register:Malloc1 00:06:52.379 bdev_register:Malloc3 00:06:52.379 bdev_register:MallocForCryptoBdev 00:06:52.379 bdev_register:Null0 00:06:52.379 bdev_register:Nvme0n1 00:06:52.379 bdev_register:Nvme0n1p0 00:06:52.379 bdev_register:Nvme0n1p1 00:06:52.379 bdev_register:PTBdevFromMalloc3 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:52.379 10:33:27 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:52.379 10:33:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:52.379 10:33:27 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:52.379 10:33:27 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:52.379 10:33:27 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:52.379 10:33:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:52.637 MallocBdevForConfigChangeCheck 00:06:52.637 10:33:27 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:52.637 10:33:27 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:52.637 10:33:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:52.637 10:33:27 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:52.637 10:33:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:53.203 10:33:28 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:53.203 INFO: shutting down applications... 00:06:53.203 10:33:28 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:53.203 10:33:28 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:53.203 10:33:28 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:53.203 10:33:28 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:53.203 [2024-07-12 10:33:28.331039] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:56.488 Calling clear_iscsi_subsystem 00:06:56.488 Calling clear_nvmf_subsystem 00:06:56.488 Calling clear_nbd_subsystem 00:06:56.488 Calling clear_ublk_subsystem 00:06:56.488 Calling clear_vhost_blk_subsystem 00:06:56.488 Calling clear_vhost_scsi_subsystem 00:06:56.488 Calling clear_bdev_subsystem 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@345 -- # break 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:56.488 10:33:31 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:56.488 10:33:31 json_config -- json_config/common.sh@31 -- # local app=target 00:06:56.488 10:33:31 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:56.488 10:33:31 json_config -- json_config/common.sh@35 -- # [[ -n 
1978074 ]] 00:06:56.488 10:33:31 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1978074 00:06:56.488 10:33:31 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:56.488 10:33:31 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:56.488 10:33:31 json_config -- json_config/common.sh@41 -- # kill -0 1978074 00:06:56.488 10:33:31 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:57.055 10:33:32 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:57.055 10:33:32 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:57.055 10:33:32 json_config -- json_config/common.sh@41 -- # kill -0 1978074 00:06:57.055 10:33:32 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:57.055 10:33:32 json_config -- json_config/common.sh@43 -- # break 00:06:57.055 10:33:32 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:57.055 10:33:32 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:57.055 SPDK target shutdown done 00:06:57.055 10:33:32 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:57.055 INFO: relaunching applications... 00:06:57.055 10:33:32 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:57.055 10:33:32 json_config -- json_config/common.sh@9 -- # local app=target 00:06:57.055 10:33:32 json_config -- json_config/common.sh@10 -- # shift 00:06:57.055 10:33:32 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:57.055 10:33:32 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:57.055 10:33:32 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:57.055 10:33:32 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:57.055 10:33:32 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:57.055 10:33:32 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:57.055 10:33:32 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1980630 00:06:57.055 10:33:32 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:57.055 Waiting for target to run... 00:06:57.055 10:33:32 json_config -- json_config/common.sh@25 -- # waitforlisten 1980630 /var/tmp/spdk_tgt.sock 00:06:57.055 10:33:32 json_config -- common/autotest_common.sh@829 -- # '[' -z 1980630 ']' 00:06:57.055 10:33:32 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:57.055 10:33:32 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:57.055 10:33:32 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:57.055 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:57.055 10:33:32 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:57.055 10:33:32 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:57.055 [2024-07-12 10:33:32.239636] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:06:57.055 [2024-07-12 10:33:32.239718] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1980630 ] 00:06:57.624 [2024-07-12 10:33:32.794097] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.883 [2024-07-12 10:33:32.901770] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.883 [2024-07-12 10:33:32.955940] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:57.883 [2024-07-12 10:33:32.963977] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:57.883 [2024-07-12 10:33:32.971995] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:57.883 [2024-07-12 10:33:33.053226] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:00.417 [2024-07-12 10:33:35.261547] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:00.417 [2024-07-12 10:33:35.261612] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:00.417 [2024-07-12 10:33:35.261627] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:00.417 [2024-07-12 10:33:35.269559] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:00.417 [2024-07-12 10:33:35.269586] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:00.417 [2024-07-12 10:33:35.277573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:00.417 [2024-07-12 10:33:35.277597] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:00.417 [2024-07-12 10:33:35.285610] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:07:00.417 [2024-07-12 10:33:35.285637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:07:00.417 [2024-07-12 10:33:35.285649] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:00.675 [2024-07-12 10:33:35.661216] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:00.675 [2024-07-12 10:33:35.661260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:00.675 [2024-07-12 10:33:35.661279] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1454b90 00:07:00.675 [2024-07-12 10:33:35.661291] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:00.675 [2024-07-12 10:33:35.661580] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:00.675 [2024-07-12 10:33:35.661598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:00.675 10:33:35 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:00.675 10:33:35 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:00.675 10:33:35 json_config -- json_config/common.sh@26 -- # echo '' 00:07:00.675 00:07:00.675 10:33:35 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:07:00.676 10:33:35 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:07:00.676 INFO: Checking if target configuration is the same... 00:07:00.676 10:33:35 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:00.676 10:33:35 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:07:00.676 10:33:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:00.676 + '[' 2 -ne 2 ']' 00:07:00.676 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:00.676 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:00.676 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:00.676 +++ basename /dev/fd/62 00:07:00.676 ++ mktemp /tmp/62.XXX 00:07:00.676 + tmp_file_1=/tmp/62.Pkd 00:07:00.676 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:00.676 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:00.676 + tmp_file_2=/tmp/spdk_tgt_config.json.U1m 00:07:00.676 + ret=0 00:07:00.676 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.243 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.243 + diff -u /tmp/62.Pkd /tmp/spdk_tgt_config.json.U1m 00:07:01.243 + echo 'INFO: JSON config files are the same' 00:07:01.243 INFO: JSON config files are the same 00:07:01.243 + rm /tmp/62.Pkd /tmp/spdk_tgt_config.json.U1m 00:07:01.243 + exit 0 00:07:01.243 10:33:36 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:01.243 10:33:36 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:07:01.243 INFO: changing configuration and checking if this can be detected... 00:07:01.243 10:33:36 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:01.243 10:33:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:01.503 10:33:36 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.503 10:33:36 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:01.503 10:33:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:01.503 + '[' 2 -ne 2 ']' 00:07:01.503 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:01.503 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:07:01.503 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:01.503 +++ basename /dev/fd/62 00:07:01.503 ++ mktemp /tmp/62.XXX 00:07:01.503 + tmp_file_1=/tmp/62.loE 00:07:01.503 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.503 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:01.503 + tmp_file_2=/tmp/spdk_tgt_config.json.DxA 00:07:01.503 + ret=0 00:07:01.503 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.761 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.761 + diff -u /tmp/62.loE /tmp/spdk_tgt_config.json.DxA 00:07:02.020 + ret=1 00:07:02.020 + echo '=== Start of file: /tmp/62.loE ===' 00:07:02.020 + cat /tmp/62.loE 00:07:02.020 + echo '=== End of file: /tmp/62.loE ===' 00:07:02.020 + echo '' 00:07:02.021 + echo '=== Start of file: /tmp/spdk_tgt_config.json.DxA ===' 00:07:02.021 + cat /tmp/spdk_tgt_config.json.DxA 00:07:02.021 + echo '=== End of file: /tmp/spdk_tgt_config.json.DxA ===' 00:07:02.021 + echo '' 00:07:02.021 + rm /tmp/62.loE /tmp/spdk_tgt_config.json.DxA 00:07:02.021 + exit 1 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:02.021 INFO: configuration change detected. 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:02.021 10:33:36 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:02.021 10:33:36 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@317 -- # [[ -n 1980630 ]] 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:02.021 10:33:36 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:02.021 10:33:36 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:02.021 10:33:36 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:02.021 10:33:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:02.280 10:33:37 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:02.280 10:33:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:02.541 10:33:37 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:02.541 10:33:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:07:02.541 10:33:37 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:02.541 10:33:37 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:02.857 10:33:37 json_config -- json_config/json_config.sh@193 -- # uname -s 00:07:02.857 10:33:37 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:07:02.857 10:33:37 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:07:02.857 10:33:37 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:07:02.857 10:33:37 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:07:02.857 10:33:37 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:02.857 10:33:37 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.857 10:33:38 json_config -- json_config/json_config.sh@323 -- # killprocess 1980630 00:07:02.857 10:33:38 json_config -- common/autotest_common.sh@948 -- # '[' -z 1980630 ']' 00:07:02.857 10:33:38 json_config -- common/autotest_common.sh@952 -- # kill -0 1980630 00:07:02.857 10:33:38 json_config -- common/autotest_common.sh@953 -- # uname 00:07:02.857 10:33:38 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:02.857 10:33:38 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1980630 00:07:03.116 10:33:38 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:03.116 10:33:38 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:03.116 10:33:38 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1980630' 00:07:03.116 killing process with pid 1980630 00:07:03.116 10:33:38 json_config -- common/autotest_common.sh@967 -- # kill 1980630 00:07:03.116 10:33:38 json_config -- common/autotest_common.sh@972 -- # wait 1980630 00:07:06.399 10:33:41 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:06.399 10:33:41 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:06.399 10:33:41 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:06.399 10:33:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:06.399 10:33:41 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:06.399 10:33:41 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:06.399 INFO: Success 00:07:06.399 00:07:06.399 real 0m27.804s 00:07:06.399 user 0m33.569s 00:07:06.399 sys 0m3.839s 00:07:06.399 10:33:41 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:06.399 10:33:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:06.399 ************************************ 00:07:06.399 END TEST json_config 00:07:06.399 ************************************ 00:07:06.399 10:33:41 -- common/autotest_common.sh@1142 -- # return 0 00:07:06.399 10:33:41 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:06.399 10:33:41 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:06.399 10:33:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.399 10:33:41 -- common/autotest_common.sh@10 -- # set +x 00:07:06.399 ************************************ 00:07:06.399 START TEST json_config_extra_key 00:07:06.399 ************************************ 00:07:06.399 10:33:41 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:06.399 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:06.399 10:33:41 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:06.400 10:33:41 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:06.400 10:33:41 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:06.400 10:33:41 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:06.400 10:33:41 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.400 10:33:41 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.400 10:33:41 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.400 10:33:41 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:06.400 10:33:41 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:06.400 10:33:41 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:06.400 INFO: launching applications... 
00:07:06.400 10:33:41 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1981980 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:06.400 Waiting for target to run... 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1981980 /var/tmp/spdk_tgt.sock 00:07:06.400 10:33:41 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 1981980 ']' 00:07:06.400 10:33:41 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:06.400 10:33:41 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:06.400 10:33:41 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:06.400 10:33:41 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:06.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:06.400 10:33:41 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:06.400 10:33:41 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:06.659 [2024-07-12 10:33:41.628601] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:06.659 [2024-07-12 10:33:41.628677] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1981980 ] 00:07:07.224 [2024-07-12 10:33:42.181972] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.224 [2024-07-12 10:33:42.291279] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.483 10:33:42 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:07.483 10:33:42 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:07.483 00:07:07.483 10:33:42 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:07.483 INFO: shutting down applications... 
00:07:07.483 10:33:42 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1981980 ]] 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1981980 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1981980 00:07:07.483 10:33:42 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:08.048 10:33:43 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:08.048 10:33:43 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:08.048 10:33:43 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1981980 00:07:08.048 10:33:43 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:08.048 10:33:43 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:08.048 10:33:43 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:08.048 10:33:43 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:08.048 SPDK target shutdown done 00:07:08.048 10:33:43 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:08.048 Success 00:07:08.048 00:07:08.048 real 0m1.616s 00:07:08.048 user 0m1.125s 00:07:08.048 sys 0m0.687s 00:07:08.048 10:33:43 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.048 10:33:43 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:08.048 ************************************ 00:07:08.048 END TEST json_config_extra_key 00:07:08.048 ************************************ 00:07:08.048 10:33:43 -- common/autotest_common.sh@1142 -- # return 0 00:07:08.048 10:33:43 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:08.048 10:33:43 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:08.048 10:33:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.048 10:33:43 -- common/autotest_common.sh@10 -- # set +x 00:07:08.048 ************************************ 00:07:08.048 START TEST alias_rpc 00:07:08.048 ************************************ 00:07:08.048 10:33:43 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:08.048 * Looking for test storage... 
00:07:08.305 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:08.305 10:33:43 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:08.305 10:33:43 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1982326 00:07:08.305 10:33:43 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1982326 00:07:08.305 10:33:43 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.305 10:33:43 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 1982326 ']' 00:07:08.305 10:33:43 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:08.305 10:33:43 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:08.305 10:33:43 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:08.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:08.305 10:33:43 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:08.305 10:33:43 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:08.305 [2024-07-12 10:33:43.309523] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:08.305 [2024-07-12 10:33:43.309594] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1982326 ] 00:07:08.305 [2024-07-12 10:33:43.423699] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.562 [2024-07-12 10:33:43.526091] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.125 10:33:44 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:09.125 10:33:44 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:09.125 10:33:44 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:09.381 10:33:44 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1982326 00:07:09.381 10:33:44 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 1982326 ']' 00:07:09.381 10:33:44 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 1982326 00:07:09.381 10:33:44 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:09.381 10:33:44 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:09.381 10:33:44 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1982326 00:07:09.637 10:33:44 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:09.637 10:33:44 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:09.637 10:33:44 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1982326' 00:07:09.637 killing process with pid 1982326 00:07:09.637 10:33:44 alias_rpc -- common/autotest_common.sh@967 -- # kill 1982326 00:07:09.637 10:33:44 alias_rpc -- common/autotest_common.sh@972 -- # wait 1982326 00:07:09.893 00:07:09.893 real 0m1.832s 00:07:09.893 user 0m2.042s 00:07:09.893 sys 0m0.558s 00:07:09.893 10:33:44 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.893 10:33:44 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:09.893 ************************************ 00:07:09.893 END TEST alias_rpc 
00:07:09.893 ************************************ 00:07:09.893 10:33:45 -- common/autotest_common.sh@1142 -- # return 0 00:07:09.893 10:33:45 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:09.893 10:33:45 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:09.893 10:33:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:09.893 10:33:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.893 10:33:45 -- common/autotest_common.sh@10 -- # set +x 00:07:09.893 ************************************ 00:07:09.893 START TEST spdkcli_tcp 00:07:09.893 ************************************ 00:07:09.893 10:33:45 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:10.150 * Looking for test storage... 00:07:10.150 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1982608 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1982608 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 1982608 ']' 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:10.150 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:10.150 10:33:45 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:10.150 10:33:45 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:10.150 [2024-07-12 10:33:45.235871] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:07:10.150 [2024-07-12 10:33:45.235939] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1982608 ] 00:07:10.408 [2024-07-12 10:33:45.365541] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:10.408 [2024-07-12 10:33:45.469977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.408 [2024-07-12 10:33:45.469982] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.970 10:33:46 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:10.970 10:33:46 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:10.970 10:33:46 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1982775 00:07:10.970 10:33:46 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:10.970 10:33:46 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:11.228 [ 00:07:11.228 "bdev_malloc_delete", 00:07:11.228 "bdev_malloc_create", 00:07:11.228 "bdev_null_resize", 00:07:11.228 "bdev_null_delete", 00:07:11.228 "bdev_null_create", 00:07:11.228 "bdev_nvme_cuse_unregister", 00:07:11.228 "bdev_nvme_cuse_register", 00:07:11.228 "bdev_opal_new_user", 00:07:11.228 "bdev_opal_set_lock_state", 00:07:11.228 "bdev_opal_delete", 00:07:11.228 "bdev_opal_get_info", 00:07:11.228 "bdev_opal_create", 00:07:11.228 "bdev_nvme_opal_revert", 00:07:11.228 "bdev_nvme_opal_init", 00:07:11.228 "bdev_nvme_send_cmd", 00:07:11.228 "bdev_nvme_get_path_iostat", 00:07:11.228 "bdev_nvme_get_mdns_discovery_info", 00:07:11.228 "bdev_nvme_stop_mdns_discovery", 00:07:11.228 "bdev_nvme_start_mdns_discovery", 00:07:11.228 "bdev_nvme_set_multipath_policy", 00:07:11.228 "bdev_nvme_set_preferred_path", 00:07:11.228 "bdev_nvme_get_io_paths", 00:07:11.228 "bdev_nvme_remove_error_injection", 00:07:11.228 "bdev_nvme_add_error_injection", 00:07:11.228 "bdev_nvme_get_discovery_info", 00:07:11.228 "bdev_nvme_stop_discovery", 00:07:11.228 "bdev_nvme_start_discovery", 00:07:11.228 "bdev_nvme_get_controller_health_info", 00:07:11.228 "bdev_nvme_disable_controller", 00:07:11.228 "bdev_nvme_enable_controller", 00:07:11.228 "bdev_nvme_reset_controller", 00:07:11.228 "bdev_nvme_get_transport_statistics", 00:07:11.228 "bdev_nvme_apply_firmware", 00:07:11.228 "bdev_nvme_detach_controller", 00:07:11.228 "bdev_nvme_get_controllers", 00:07:11.228 "bdev_nvme_attach_controller", 00:07:11.228 "bdev_nvme_set_hotplug", 00:07:11.228 "bdev_nvme_set_options", 00:07:11.228 "bdev_passthru_delete", 00:07:11.228 "bdev_passthru_create", 00:07:11.228 "bdev_lvol_set_parent_bdev", 00:07:11.228 "bdev_lvol_set_parent", 00:07:11.228 "bdev_lvol_check_shallow_copy", 00:07:11.228 "bdev_lvol_start_shallow_copy", 00:07:11.228 "bdev_lvol_grow_lvstore", 00:07:11.228 "bdev_lvol_get_lvols", 00:07:11.228 "bdev_lvol_get_lvstores", 00:07:11.228 "bdev_lvol_delete", 00:07:11.228 "bdev_lvol_set_read_only", 00:07:11.228 "bdev_lvol_resize", 00:07:11.228 "bdev_lvol_decouple_parent", 00:07:11.228 "bdev_lvol_inflate", 00:07:11.228 "bdev_lvol_rename", 00:07:11.228 "bdev_lvol_clone_bdev", 00:07:11.228 "bdev_lvol_clone", 00:07:11.228 "bdev_lvol_snapshot", 00:07:11.228 "bdev_lvol_create", 00:07:11.228 "bdev_lvol_delete_lvstore", 00:07:11.228 "bdev_lvol_rename_lvstore", 00:07:11.228 "bdev_lvol_create_lvstore", 
00:07:11.228 "bdev_raid_set_options", 00:07:11.228 "bdev_raid_remove_base_bdev", 00:07:11.228 "bdev_raid_add_base_bdev", 00:07:11.228 "bdev_raid_delete", 00:07:11.228 "bdev_raid_create", 00:07:11.228 "bdev_raid_get_bdevs", 00:07:11.228 "bdev_error_inject_error", 00:07:11.228 "bdev_error_delete", 00:07:11.228 "bdev_error_create", 00:07:11.228 "bdev_split_delete", 00:07:11.228 "bdev_split_create", 00:07:11.228 "bdev_delay_delete", 00:07:11.228 "bdev_delay_create", 00:07:11.228 "bdev_delay_update_latency", 00:07:11.228 "bdev_zone_block_delete", 00:07:11.228 "bdev_zone_block_create", 00:07:11.228 "blobfs_create", 00:07:11.228 "blobfs_detect", 00:07:11.228 "blobfs_set_cache_size", 00:07:11.228 "bdev_crypto_delete", 00:07:11.228 "bdev_crypto_create", 00:07:11.228 "bdev_compress_delete", 00:07:11.228 "bdev_compress_create", 00:07:11.228 "bdev_compress_get_orphans", 00:07:11.228 "bdev_aio_delete", 00:07:11.228 "bdev_aio_rescan", 00:07:11.228 "bdev_aio_create", 00:07:11.228 "bdev_ftl_set_property", 00:07:11.228 "bdev_ftl_get_properties", 00:07:11.228 "bdev_ftl_get_stats", 00:07:11.228 "bdev_ftl_unmap", 00:07:11.228 "bdev_ftl_unload", 00:07:11.228 "bdev_ftl_delete", 00:07:11.228 "bdev_ftl_load", 00:07:11.228 "bdev_ftl_create", 00:07:11.228 "bdev_virtio_attach_controller", 00:07:11.228 "bdev_virtio_scsi_get_devices", 00:07:11.228 "bdev_virtio_detach_controller", 00:07:11.228 "bdev_virtio_blk_set_hotplug", 00:07:11.228 "bdev_iscsi_delete", 00:07:11.228 "bdev_iscsi_create", 00:07:11.228 "bdev_iscsi_set_options", 00:07:11.228 "accel_error_inject_error", 00:07:11.228 "ioat_scan_accel_module", 00:07:11.228 "dsa_scan_accel_module", 00:07:11.228 "iaa_scan_accel_module", 00:07:11.228 "dpdk_cryptodev_get_driver", 00:07:11.228 "dpdk_cryptodev_set_driver", 00:07:11.228 "dpdk_cryptodev_scan_accel_module", 00:07:11.228 "compressdev_scan_accel_module", 00:07:11.228 "keyring_file_remove_key", 00:07:11.228 "keyring_file_add_key", 00:07:11.228 "keyring_linux_set_options", 00:07:11.228 "iscsi_get_histogram", 00:07:11.228 "iscsi_enable_histogram", 00:07:11.228 "iscsi_set_options", 00:07:11.228 "iscsi_get_auth_groups", 00:07:11.228 "iscsi_auth_group_remove_secret", 00:07:11.228 "iscsi_auth_group_add_secret", 00:07:11.228 "iscsi_delete_auth_group", 00:07:11.228 "iscsi_create_auth_group", 00:07:11.228 "iscsi_set_discovery_auth", 00:07:11.228 "iscsi_get_options", 00:07:11.228 "iscsi_target_node_request_logout", 00:07:11.228 "iscsi_target_node_set_redirect", 00:07:11.228 "iscsi_target_node_set_auth", 00:07:11.228 "iscsi_target_node_add_lun", 00:07:11.228 "iscsi_get_stats", 00:07:11.228 "iscsi_get_connections", 00:07:11.228 "iscsi_portal_group_set_auth", 00:07:11.228 "iscsi_start_portal_group", 00:07:11.228 "iscsi_delete_portal_group", 00:07:11.228 "iscsi_create_portal_group", 00:07:11.228 "iscsi_get_portal_groups", 00:07:11.228 "iscsi_delete_target_node", 00:07:11.228 "iscsi_target_node_remove_pg_ig_maps", 00:07:11.228 "iscsi_target_node_add_pg_ig_maps", 00:07:11.228 "iscsi_create_target_node", 00:07:11.228 "iscsi_get_target_nodes", 00:07:11.228 "iscsi_delete_initiator_group", 00:07:11.228 "iscsi_initiator_group_remove_initiators", 00:07:11.228 "iscsi_initiator_group_add_initiators", 00:07:11.228 "iscsi_create_initiator_group", 00:07:11.228 "iscsi_get_initiator_groups", 00:07:11.228 "nvmf_set_crdt", 00:07:11.228 "nvmf_set_config", 00:07:11.228 "nvmf_set_max_subsystems", 00:07:11.228 "nvmf_stop_mdns_prr", 00:07:11.228 "nvmf_publish_mdns_prr", 00:07:11.228 "nvmf_subsystem_get_listeners", 00:07:11.228 
"nvmf_subsystem_get_qpairs", 00:07:11.228 "nvmf_subsystem_get_controllers", 00:07:11.228 "nvmf_get_stats", 00:07:11.228 "nvmf_get_transports", 00:07:11.228 "nvmf_create_transport", 00:07:11.228 "nvmf_get_targets", 00:07:11.228 "nvmf_delete_target", 00:07:11.228 "nvmf_create_target", 00:07:11.228 "nvmf_subsystem_allow_any_host", 00:07:11.228 "nvmf_subsystem_remove_host", 00:07:11.228 "nvmf_subsystem_add_host", 00:07:11.228 "nvmf_ns_remove_host", 00:07:11.228 "nvmf_ns_add_host", 00:07:11.228 "nvmf_subsystem_remove_ns", 00:07:11.228 "nvmf_subsystem_add_ns", 00:07:11.228 "nvmf_subsystem_listener_set_ana_state", 00:07:11.228 "nvmf_discovery_get_referrals", 00:07:11.228 "nvmf_discovery_remove_referral", 00:07:11.228 "nvmf_discovery_add_referral", 00:07:11.228 "nvmf_subsystem_remove_listener", 00:07:11.228 "nvmf_subsystem_add_listener", 00:07:11.228 "nvmf_delete_subsystem", 00:07:11.228 "nvmf_create_subsystem", 00:07:11.228 "nvmf_get_subsystems", 00:07:11.228 "env_dpdk_get_mem_stats", 00:07:11.228 "nbd_get_disks", 00:07:11.228 "nbd_stop_disk", 00:07:11.228 "nbd_start_disk", 00:07:11.228 "ublk_recover_disk", 00:07:11.228 "ublk_get_disks", 00:07:11.229 "ublk_stop_disk", 00:07:11.229 "ublk_start_disk", 00:07:11.229 "ublk_destroy_target", 00:07:11.229 "ublk_create_target", 00:07:11.229 "virtio_blk_create_transport", 00:07:11.229 "virtio_blk_get_transports", 00:07:11.229 "vhost_controller_set_coalescing", 00:07:11.229 "vhost_get_controllers", 00:07:11.229 "vhost_delete_controller", 00:07:11.229 "vhost_create_blk_controller", 00:07:11.229 "vhost_scsi_controller_remove_target", 00:07:11.229 "vhost_scsi_controller_add_target", 00:07:11.229 "vhost_start_scsi_controller", 00:07:11.229 "vhost_create_scsi_controller", 00:07:11.229 "thread_set_cpumask", 00:07:11.229 "framework_get_governor", 00:07:11.229 "framework_get_scheduler", 00:07:11.229 "framework_set_scheduler", 00:07:11.229 "framework_get_reactors", 00:07:11.229 "thread_get_io_channels", 00:07:11.229 "thread_get_pollers", 00:07:11.229 "thread_get_stats", 00:07:11.229 "framework_monitor_context_switch", 00:07:11.229 "spdk_kill_instance", 00:07:11.229 "log_enable_timestamps", 00:07:11.229 "log_get_flags", 00:07:11.229 "log_clear_flag", 00:07:11.229 "log_set_flag", 00:07:11.229 "log_get_level", 00:07:11.229 "log_set_level", 00:07:11.229 "log_get_print_level", 00:07:11.229 "log_set_print_level", 00:07:11.229 "framework_enable_cpumask_locks", 00:07:11.229 "framework_disable_cpumask_locks", 00:07:11.229 "framework_wait_init", 00:07:11.229 "framework_start_init", 00:07:11.229 "scsi_get_devices", 00:07:11.229 "bdev_get_histogram", 00:07:11.229 "bdev_enable_histogram", 00:07:11.229 "bdev_set_qos_limit", 00:07:11.229 "bdev_set_qd_sampling_period", 00:07:11.229 "bdev_get_bdevs", 00:07:11.229 "bdev_reset_iostat", 00:07:11.229 "bdev_get_iostat", 00:07:11.229 "bdev_examine", 00:07:11.229 "bdev_wait_for_examine", 00:07:11.229 "bdev_set_options", 00:07:11.229 "notify_get_notifications", 00:07:11.229 "notify_get_types", 00:07:11.229 "accel_get_stats", 00:07:11.229 "accel_set_options", 00:07:11.229 "accel_set_driver", 00:07:11.229 "accel_crypto_key_destroy", 00:07:11.229 "accel_crypto_keys_get", 00:07:11.229 "accel_crypto_key_create", 00:07:11.229 "accel_assign_opc", 00:07:11.229 "accel_get_module_info", 00:07:11.229 "accel_get_opc_assignments", 00:07:11.229 "vmd_rescan", 00:07:11.229 "vmd_remove_device", 00:07:11.229 "vmd_enable", 00:07:11.229 "sock_get_default_impl", 00:07:11.229 "sock_set_default_impl", 00:07:11.229 "sock_impl_set_options", 00:07:11.229 
"sock_impl_get_options", 00:07:11.229 "iobuf_get_stats", 00:07:11.229 "iobuf_set_options", 00:07:11.229 "framework_get_pci_devices", 00:07:11.229 "framework_get_config", 00:07:11.229 "framework_get_subsystems", 00:07:11.229 "trace_get_info", 00:07:11.229 "trace_get_tpoint_group_mask", 00:07:11.229 "trace_disable_tpoint_group", 00:07:11.229 "trace_enable_tpoint_group", 00:07:11.229 "trace_clear_tpoint_mask", 00:07:11.229 "trace_set_tpoint_mask", 00:07:11.229 "keyring_get_keys", 00:07:11.229 "spdk_get_version", 00:07:11.229 "rpc_get_methods" 00:07:11.229 ] 00:07:11.229 10:33:46 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:11.229 10:33:46 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:11.229 10:33:46 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:11.229 10:33:46 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:11.229 10:33:46 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1982608 00:07:11.229 10:33:46 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 1982608 ']' 00:07:11.229 10:33:46 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 1982608 00:07:11.229 10:33:46 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:07:11.229 10:33:46 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:11.229 10:33:46 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1982608 00:07:11.486 10:33:46 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:11.486 10:33:46 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:11.486 10:33:46 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1982608' 00:07:11.486 killing process with pid 1982608 00:07:11.486 10:33:46 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 1982608 00:07:11.486 10:33:46 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 1982608 00:07:11.743 00:07:11.743 real 0m1.781s 00:07:11.743 user 0m3.151s 00:07:11.743 sys 0m0.615s 00:07:11.743 10:33:46 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:11.743 10:33:46 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:11.743 ************************************ 00:07:11.743 END TEST spdkcli_tcp 00:07:11.743 ************************************ 00:07:11.743 10:33:46 -- common/autotest_common.sh@1142 -- # return 0 00:07:11.743 10:33:46 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:11.743 10:33:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:11.743 10:33:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.743 10:33:46 -- common/autotest_common.sh@10 -- # set +x 00:07:11.743 ************************************ 00:07:11.743 START TEST dpdk_mem_utility 00:07:11.743 ************************************ 00:07:11.743 10:33:46 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:12.001 * Looking for test storage... 
00:07:12.001 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:12.001 10:33:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:12.001 10:33:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1982854 00:07:12.001 10:33:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1982854 00:07:12.001 10:33:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:12.001 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 1982854 ']' 00:07:12.001 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:12.001 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:12.001 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:12.001 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:12.001 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:12.001 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:12.001 [2024-07-12 10:33:47.088618] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:12.001 [2024-07-12 10:33:47.088699] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1982854 ] 00:07:12.259 [2024-07-12 10:33:47.210802] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.259 [2024-07-12 10:33:47.312150] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.825 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:12.825 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:07:12.825 10:33:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:12.825 10:33:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:12.825 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:12.825 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:12.825 { 00:07:12.825 "filename": "/tmp/spdk_mem_dump.txt" 00:07:12.825 } 00:07:12.825 10:33:47 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:12.825 10:33:47 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:13.087 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:13.087 2 heaps totaling size 816.000000 MiB 00:07:13.087 size: 814.000000 MiB heap id: 0 00:07:13.087 size: 2.000000 MiB heap id: 1 00:07:13.087 end heaps---------- 00:07:13.087 8 mempools totaling size 598.116089 MiB 00:07:13.087 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:13.087 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:13.087 size: 84.521057 MiB name: bdev_io_1982854 00:07:13.087 size: 51.011292 MiB name: evtpool_1982854 00:07:13.087 size: 50.003479 MiB name: 
msgpool_1982854 00:07:13.087 size: 21.763794 MiB name: PDU_Pool 00:07:13.087 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:13.087 size: 0.026123 MiB name: Session_Pool 00:07:13.087 end mempools------- 00:07:13.087 201 memzones totaling size 4.176453 MiB 00:07:13.087 size: 1.000366 MiB name: RG_ring_0_1982854 00:07:13.087 size: 1.000366 MiB name: RG_ring_1_1982854 00:07:13.087 size: 1.000366 MiB name: RG_ring_4_1982854 00:07:13.087 size: 1.000366 MiB name: RG_ring_5_1982854 00:07:13.087 size: 0.125366 MiB name: RG_ring_2_1982854 00:07:13.087 size: 0.015991 MiB name: RG_ring_3_1982854 00:07:13.087 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:13.087 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:13.087 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:13.087 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:13.087 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:13.087 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:13.087 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:13.087 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:01.7_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:13.088 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:13.088 size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:13.088 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:13.088 size: 0.000122 
MiB name: rte_cryptodev_data_2 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:13.088 size: 
0.000122 MiB name: rte_compressdev_data_20 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_72 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:13.088 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:13.088 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:13.089 size: 0.000122 MiB name: 
rte_cryptodev_data_80 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_94 00:07:13.089 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:13.089 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:13.089 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:13.089 end memzones------- 00:07:13.089 10:33:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:13.089 heap id: 0 total size: 814.000000 MiB number of busy elements: 520 number of free elements: 14 00:07:13.089 list of free elements. size: 11.814636 MiB 00:07:13.089 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:13.089 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:13.089 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:13.089 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:13.089 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:13.089 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:13.089 element at address: 0x200007000000 with size: 0.960022 MiB 00:07:13.089 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:13.089 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:07:13.089 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:13.089 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:13.089 element at address: 0x200000800000 with size: 0.486694 MiB 00:07:13.089 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:13.089 element at address: 0x200027e00000 with size: 0.402710 MiB 00:07:13.089 list of standard malloc elements. 
size: 199.877075 MiB 00:07:13.089 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:13.089 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:13.089 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:13.089 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:13.089 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:13.089 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:13.089 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:13.089 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:13.089 element at address: 0x200000330b40 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000337640 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000033e140 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000344c40 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000034b740 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000352240 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000358d40 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000035f840 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000366880 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000036a340 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000036de00 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003718c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000375380 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000378e40 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000037c900 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000383e80 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000387940 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000038b400 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000392980 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000396440 with size: 0.004395 MiB 00:07:13.089 element at address: 0x200000399f00 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003bea80 with size: 0.004395 MiB 
00:07:13.089 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003d4b00 with size: 0.004395 MiB 00:07:13.089 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:13.089 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:07:13.089 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000333040 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000335540 with size: 0.004028 MiB 00:07:13.089 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000339b40 with size: 0.004028 MiB 00:07:13.089 element at address: 0x20000033c040 with size: 0.004028 MiB 00:07:13.089 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000340640 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000342b40 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000347140 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000349640 with size: 0.004028 MiB 00:07:13.089 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:07:13.089 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:07:13.089 element at address: 0x200000350140 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000354740 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000356c40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000035a1c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000035b240 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000035d740 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000361d40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000364780 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000365800 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000368240 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000370840 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000373280 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000374300 with size: 0.004028 MiB 00:07:13.090 element at 
address: 0x200000376d40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000037a800 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000037b880 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000037f340 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000381d80 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000382e00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000385840 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000389300 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000038a380 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000038cdc0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000038de40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000390880 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000391900 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000394340 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000397e00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000398e80 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000039c940 with size: 0.004028 MiB 00:07:13.090 element at address: 0x20000039f380 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003c0440 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003cffc0 
with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:13.090 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:13.090 element at address: 0x200000204c80 with size: 0.000305 MiB 00:07:13.090 element at address: 0x200000200000 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200180 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200240 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200300 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200480 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200540 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200600 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200780 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200840 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200900 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200a80 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200b40 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200c00 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200d80 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200e40 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200f00 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201080 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201140 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201200 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201380 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201440 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201500 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201680 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201740 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201800 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201980 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201a40 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201b00 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201c80 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201d40 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201e00 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000201f80 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000202040 with size: 0.000183 MiB 
00:07:13.090 element at address: 0x200000202100 with size: 0.000183 MiB 00:07:13.090 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000202280 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000202340 with size: 0.000183 MiB 00:07:13.090 element at address: 0x200000202400 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202580 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202640 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202700 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202880 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202940 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202a00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202b80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202c40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202d00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202e80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000202f40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203000 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203180 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203240 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203300 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203480 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203540 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203600 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203780 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203840 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203900 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203a80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203b40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203c00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203d80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203e40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203f00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204080 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204140 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204200 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204380 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204440 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204500 with size: 0.000183 MiB 00:07:13.091 element at 
address: 0x2000002045c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204680 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204740 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204800 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204980 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204a40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204b00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204dc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204e80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000204f40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205000 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205180 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205240 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205300 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205480 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205540 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205600 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205780 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205840 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205900 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205a80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205b40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205c00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205d80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205e40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205f00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000206080 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000206140 with size: 0.000183 MiB 00:07:13.091 element at address: 0x200000206200 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000020a780 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022af80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b040 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b100 
with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b280 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b340 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b400 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b580 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b640 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b700 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b900 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022be40 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022c080 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022c140 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022c200 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022c380 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022c440 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000022c500 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000032e700 with size: 0.000183 MiB 00:07:13.091 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000331d40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000338840 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000033f340 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000345e40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000034c940 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000034fec0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000353440 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000359f40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000360a40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000364180 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000364240 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000364400 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000367a80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000367c40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000367d00 with size: 0.000183 MiB 
00:07:13.092 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036b540 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036b700 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036b980 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036f000 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036f280 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000036f440 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000372c80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000372d40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000372f00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000376580 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000376740 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000376800 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037a040 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037a200 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037a480 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037db00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000037df40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000381780 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000381840 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000381a00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000385080 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000385240 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000385300 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000388b40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000388d00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000388f80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000038c600 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000038c880 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000390280 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000390340 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000390500 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000393b80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000393d40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000393e00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:07:13.092 element at 
address: 0x200000397640 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000397800 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x200000397a80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039b100 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039b380 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039b540 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x20000039f000 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003c3900 
with size: 0.000183 MiB 00:07:13.092 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087c980 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:13.093 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e67180 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e67240 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6de40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6eac0 with size: 0.000183 MiB 
00:07:13.093 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:13.093 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:13.093 list of memzone associated elements. 
size: 602.308289 MiB 00:07:13.093 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:13.093 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:13.093 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:13.093 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:13.093 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:13.093 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1982854_0 00:07:13.093 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:13.093 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1982854_0 00:07:13.093 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:13.093 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1982854_0 00:07:13.093 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:13.093 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:13.093 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:13.093 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:13.093 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:13.093 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1982854 00:07:13.093 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:13.093 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1982854 00:07:13.093 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:07:13.093 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1982854 00:07:13.093 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:13.093 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:13.093 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:13.093 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:13.093 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:13.093 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:13.093 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:13.093 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:13.093 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:13.093 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1982854 00:07:13.093 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:13.094 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1982854 00:07:13.094 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:13.094 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1982854 00:07:13.094 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:13.094 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1982854 00:07:13.094 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:13.094 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1982854 00:07:13.094 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:13.094 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:13.094 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:13.094 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:13.094 element at address: 0x20001947c600 with size: 0.250488 MiB 00:07:13.094 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:13.094 element at address: 0x20000020a840 with size: 0.125488 MiB 00:07:13.094 associated 
memzone info: size: 0.125366 MiB name: RG_ring_2_1982854 00:07:13.094 element at address: 0x2000070f5c40 with size: 0.031738 MiB 00:07:13.094 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:13.094 element at address: 0x200027e67300 with size: 0.023743 MiB 00:07:13.094 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:13.094 element at address: 0x200000206580 with size: 0.016113 MiB 00:07:13.094 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1982854 00:07:13.094 element at address: 0x200027e6d440 with size: 0.002441 MiB 00:07:13.094 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:13.094 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:07:13.094 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:13.094 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:13.094 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:13.094 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:13.094 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:13.094 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:13.094 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:13.094 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:13.094 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:13.094 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:13.094 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:13.094 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:13.094 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:13.094 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:13.094 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:13.094 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:13.094 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:13.094 element at address: 0x20000039b700 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:13.094 element at address: 0x200000397c40 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 
0000:3f:01.1_qat 00:07:13.094 element at address: 0x200000394180 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:13.094 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:13.094 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:13.094 element at address: 0x200000389140 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:13.094 element at address: 0x200000385680 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:13.094 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:07:13.094 element at address: 0x20000037e100 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:13.094 element at address: 0x20000037a640 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:13.094 element at address: 0x200000376b80 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:13.094 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:13.094 element at address: 0x20000036f600 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:13.094 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:13.094 element at address: 0x200000368080 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:13.094 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:13.094 element at address: 0x200000360b00 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:13.094 element at address: 0x20000035d580 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:13.094 element at address: 0x20000035a000 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:13.094 element at address: 0x200000356a80 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:13.094 element at address: 0x200000353500 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:13.094 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:13.094 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:13.094 element at address: 0x200000349480 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:13.094 element at address: 0x200000345f00 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:13.094 element at address: 
0x200000342980 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:13.094 element at address: 0x20000033f400 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:13.094 element at address: 0x20000033be80 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:13.094 element at address: 0x200000338900 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:13.094 element at address: 0x200000335380 with size: 0.000427 MiB 00:07:13.094 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:13.094 element at address: 0x200000331e00 with size: 0.000427 MiB 00:07:13.095 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:13.095 element at address: 0x20000032e880 with size: 0.000427 MiB 00:07:13.095 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:13.095 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:07:13.095 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:13.095 element at address: 0x20000022b7c0 with size: 0.000305 MiB 00:07:13.095 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1982854 00:07:13.095 element at address: 0x200000206380 with size: 0.000305 MiB 00:07:13.095 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1982854 00:07:13.095 element at address: 0x200027e6df00 with size: 0.000305 MiB 00:07:13.095 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:13.095 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:13.095 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:13.095 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:07:13.095 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:13.095 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:13.095 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:07:13.095 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:13.095 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:13.095 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:07:13.095 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:13.095 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:13.095 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:07:13.095 element at 
address: 0x2000003c7700 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:13.095 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:13.095 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:07:13.095 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:13.095 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:13.095 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:07:13.095 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:13.095 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:13.095 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:07:13.095 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:13.095 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:13.095 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:07:13.095 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:13.095 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:13.095 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:07:13.095 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:13.095 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:13.095 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:07:13.095 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:13.095 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:13.095 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:07:13.095 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:13.095 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_23 00:07:13.095 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:07:13.095 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:13.095 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:13.095 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:07:13.095 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:13.095 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:13.095 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:07:13.095 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:13.095 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:13.095 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:07:13.095 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:13.095 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:13.095 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:07:13.095 element at address: 0x20000039b600 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:13.095 element at address: 0x20000039b440 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:13.095 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:07:13.095 element at address: 0x200000397b40 with size: 0.000244 MiB 00:07:13.095 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:13.096 element at address: 0x200000397980 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:13.096 element at address: 0x200000397700 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:07:13.096 element at address: 0x200000394080 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:13.096 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:13.096 element at address: 0x200000393c40 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:07:13.096 element at address: 
0x2000003905c0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:13.096 element at address: 0x200000390400 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:13.096 element at address: 0x200000390180 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:07:13.096 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:13.096 element at address: 0x20000038c940 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:13.096 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:07:13.096 element at address: 0x200000389040 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:13.096 element at address: 0x200000388e80 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:13.096 element at address: 0x200000388c00 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:07:13.096 element at address: 0x200000385580 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:13.096 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:13.096 element at address: 0x200000385140 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:07:13.096 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:13.096 element at address: 0x200000381900 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:13.096 element at address: 0x200000381680 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:07:13.096 element at address: 0x20000037e000 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:13.096 element at address: 0x20000037de40 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:13.096 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:07:13.096 element at address: 0x20000037a540 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:13.096 element at address: 0x20000037a380 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:13.096 element at address: 0x20000037a100 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:07:13.096 element at address: 0x200000376a80 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:13.096 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_53 00:07:13.096 element at address: 0x200000376640 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:07:13.096 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:13.096 element at address: 0x200000372e00 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:13.096 element at address: 0x200000372b80 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:07:13.096 element at address: 0x20000036f500 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:13.096 element at address: 0x20000036f340 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:13.096 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:07:13.096 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:13.096 element at address: 0x20000036b880 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:13.096 element at address: 0x20000036b600 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:07:13.096 element at address: 0x200000367f80 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:13.096 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:13.096 element at address: 0x200000367b40 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:07:13.096 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:13.096 element at address: 0x200000364300 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:13.096 element at address: 0x200000364080 with size: 0.000244 MiB 00:07:13.096 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:07:13.096 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:07:13.096 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:13.096 10:33:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:13.096 10:33:48 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1982854 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 1982854 ']' 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 1982854 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1982854 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:13.096 10:33:48 
dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1982854' 00:07:13.096 killing process with pid 1982854 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 1982854 00:07:13.096 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 1982854 00:07:13.664 00:07:13.664 real 0m1.651s 00:07:13.664 user 0m1.733s 00:07:13.664 sys 0m0.534s 00:07:13.664 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:13.664 10:33:48 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:13.664 ************************************ 00:07:13.664 END TEST dpdk_mem_utility 00:07:13.664 ************************************ 00:07:13.664 10:33:48 -- common/autotest_common.sh@1142 -- # return 0 00:07:13.664 10:33:48 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:13.664 10:33:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:13.664 10:33:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.664 10:33:48 -- common/autotest_common.sh@10 -- # set +x 00:07:13.664 ************************************ 00:07:13.664 START TEST event 00:07:13.664 ************************************ 00:07:13.664 10:33:48 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:13.664 * Looking for test storage... 00:07:13.664 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:13.664 10:33:48 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:13.664 10:33:48 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:13.664 10:33:48 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:13.664 10:33:48 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:13.664 10:33:48 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.664 10:33:48 event -- common/autotest_common.sh@10 -- # set +x 00:07:13.664 ************************************ 00:07:13.664 START TEST event_perf 00:07:13.664 ************************************ 00:07:13.664 10:33:48 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:13.664 Running I/O for 1 seconds...[2024-07-12 10:33:48.818233] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
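The per-address element listing and the memzone summary above are the DPDK memory dump gathered during the dpdk_mem_utility run, after which the script kills pid 1982854. A minimal sketch of pulling a comparable dump by hand, assuming a running SPDK target on the default RPC socket and that the env_dpdk_get_mem_stats RPC is available (it writes the dump to a file and reports that file's path):

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/scripts/rpc.py env_dpdk_get_mem_stats   # reply names the file the stats were written to
  # cat the reported file to see the same "element at address ... with size ..." listing as above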
00:07:13.664 [2024-07-12 10:33:48.818293] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983253 ] 00:07:13.922 [2024-07-12 10:33:48.946328] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:13.922 [2024-07-12 10:33:49.046981] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:13.922 [2024-07-12 10:33:49.047065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:13.922 [2024-07-12 10:33:49.047140] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:13.922 [2024-07-12 10:33:49.047143] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.294 Running I/O for 1 seconds... 00:07:15.294 lcore 0: 178234 00:07:15.294 lcore 1: 178234 00:07:15.294 lcore 2: 178232 00:07:15.294 lcore 3: 178233 00:07:15.294 done. 00:07:15.294 00:07:15.294 real 0m1.346s 00:07:15.294 user 0m4.203s 00:07:15.294 sys 0m0.137s 00:07:15.294 10:33:50 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:15.294 10:33:50 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:15.294 ************************************ 00:07:15.294 END TEST event_perf 00:07:15.294 ************************************ 00:07:15.294 10:33:50 event -- common/autotest_common.sh@1142 -- # return 0 00:07:15.294 10:33:50 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:15.294 10:33:50 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:15.294 10:33:50 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.294 10:33:50 event -- common/autotest_common.sh@10 -- # set +x 00:07:15.294 ************************************ 00:07:15.294 START TEST event_reactor 00:07:15.294 ************************************ 00:07:15.294 10:33:50 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:15.294 [2024-07-12 10:33:50.229778] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
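The event_perf run that finishes above drives the event framework with -m 0xF (a four-core reactor mask) for -t 1 second and reports per-lcore counts of roughly 178k. A sketch of re-running the same example by hand with a smaller mask and a longer window, using the binary path from this workspace:

  EVENT_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
  sudo $EVENT_DIR/event_perf/event_perf -m 0x3 -t 5   # two reactors, 5-second run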
00:07:15.294 [2024-07-12 10:33:50.229836] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983452 ] 00:07:15.294 [2024-07-12 10:33:50.356059] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.294 [2024-07-12 10:33:50.456401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.664 test_start 00:07:16.664 oneshot 00:07:16.664 tick 100 00:07:16.664 tick 100 00:07:16.664 tick 250 00:07:16.664 tick 100 00:07:16.664 tick 100 00:07:16.664 tick 100 00:07:16.664 tick 250 00:07:16.664 tick 500 00:07:16.664 tick 100 00:07:16.664 tick 100 00:07:16.664 tick 250 00:07:16.664 tick 100 00:07:16.664 tick 100 00:07:16.664 test_end 00:07:16.664 00:07:16.664 real 0m1.331s 00:07:16.664 user 0m1.188s 00:07:16.664 sys 0m0.136s 00:07:16.664 10:33:51 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.664 10:33:51 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:16.664 ************************************ 00:07:16.664 END TEST event_reactor 00:07:16.664 ************************************ 00:07:16.664 10:33:51 event -- common/autotest_common.sh@1142 -- # return 0 00:07:16.664 10:33:51 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:16.664 10:33:51 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:16.664 10:33:51 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.664 10:33:51 event -- common/autotest_common.sh@10 -- # set +x 00:07:16.664 ************************************ 00:07:16.664 START TEST event_reactor_perf 00:07:16.664 ************************************ 00:07:16.664 10:33:51 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:16.664 [2024-07-12 10:33:51.650895] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
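The event_reactor test above starts a single reactor on core 0 and prints the oneshot/tick trace shown; each of these binaries is launched through the run_test helper from the common/autotest_common.sh sourced by these suites, which emits the START/END banners and the real/user/sys timing seen in this log. A sketch of that wrapping, reusing the invocation from above:

  # run_test <name> <command...> -- helper provided by common/autotest_common.sh
  run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1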
00:07:16.664 [2024-07-12 10:33:51.650953] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983653 ] 00:07:16.664 [2024-07-12 10:33:51.777834] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.923 [2024-07-12 10:33:51.878998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.859 test_start 00:07:17.859 test_end 00:07:17.859 Performance: 327044 events per second 00:07:17.859 00:07:17.859 real 0m1.349s 00:07:17.859 user 0m1.201s 00:07:17.859 sys 0m0.142s 00:07:17.859 10:33:52 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:17.859 10:33:52 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:17.859 ************************************ 00:07:17.859 END TEST event_reactor_perf 00:07:17.859 ************************************ 00:07:17.859 10:33:53 event -- common/autotest_common.sh@1142 -- # return 0 00:07:17.859 10:33:53 event -- event/event.sh@49 -- # uname -s 00:07:17.859 10:33:53 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:17.859 10:33:53 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:17.859 10:33:53 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:17.859 10:33:53 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.859 10:33:53 event -- common/autotest_common.sh@10 -- # set +x 00:07:18.117 ************************************ 00:07:18.117 START TEST event_scheduler 00:07:18.117 ************************************ 00:07:18.117 10:33:53 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:18.117 * Looking for test storage... 00:07:18.117 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:18.117 10:33:53 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:18.117 10:33:53 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1983873 00:07:18.117 10:33:53 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:18.117 10:33:53 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:18.117 10:33:53 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1983873 00:07:18.117 10:33:53 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 1983873 ']' 00:07:18.117 10:33:53 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.117 10:33:53 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:18.117 10:33:53 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.117 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
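reactor_perf above measures a single reactor and reports 327044 events per second over its 1-second window (-t 1). Lengthening the window is the obvious way to get a steadier figure; a sketch with the same binary path:

  sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 10   # 10-second measurement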
00:07:18.117 10:33:53 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:18.117 10:33:53 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.117 [2024-07-12 10:33:53.220969] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:18.117 [2024-07-12 10:33:53.221045] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983873 ] 00:07:18.375 [2024-07-12 10:33:53.325803] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:18.375 [2024-07-12 10:33:53.410534] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.375 [2024-07-12 10:33:53.410560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.375 [2024-07-12 10:33:53.410639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:18.375 [2024-07-12 10:33:53.410640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.979 10:33:54 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:18.979 10:33:54 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:07:18.979 10:33:54 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:18.979 10:33:54 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:18.979 10:33:54 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.979 [2024-07-12 10:33:54.173527] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:18.979 [2024-07-12 10:33:54.173554] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:18.979 [2024-07-12 10:33:54.173565] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:18.979 [2024-07-12 10:33:54.173573] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:18.979 [2024-07-12 10:33:54.173581] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:19.265 10:33:54 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:19.265 10:33:54 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 [2024-07-12 10:33:54.266293] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
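The scheduler app above was started with --wait-for-rpc, so the test can select the dynamic scheduler before subsystem initialization; the *ERROR* about SMT siblings only means the DPDK governor could not be initialized, and the dynamic scheduler still comes up with the option values printed (load limit 20, core limit 80, core busy 95). A sketch of that RPC ordering against a target started the same way (framework_get_scheduler is assumed here as a convenient check; the other two calls appear verbatim above):

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  $RPC framework_set_scheduler dynamic   # must precede framework_start_init
  $RPC framework_start_init              # finishes subsystem init once the scheduler is chosen
  $RPC framework_get_scheduler           # assumed: query which scheduler is now active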
00:07:19.265 10:33:54 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:19.265 10:33:54 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:19.265 10:33:54 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 ************************************ 00:07:19.265 START TEST scheduler_create_thread 00:07:19.265 ************************************ 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 2 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 3 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 4 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 5 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 6 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 7 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 8 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 9 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 10 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.265 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.831 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.831 10:33:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:19.831 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.831 10:33:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.205 10:33:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.205 10:33:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:21.205 10:33:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:21.205 10:33:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.205 10:33:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:22.583 10:33:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:22.583 00:07:22.583 real 0m3.100s 00:07:22.583 user 0m0.024s 00:07:22.583 sys 0m0.007s 00:07:22.583 10:33:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.583 10:33:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:22.583 ************************************ 00:07:22.583 END TEST scheduler_create_thread 00:07:22.583 ************************************ 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:07:22.583 10:33:57 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:22.583 10:33:57 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1983873 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 1983873 ']' 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 1983873 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1983873 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1983873' 00:07:22.583 killing process with pid 1983873 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 1983873 00:07:22.583 10:33:57 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 1983873 00:07:22.841 [2024-07-12 10:33:57.789903] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
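The scheduler_create_thread test that just finished drives the whole thread lifecycle through RPCs (rpc_cmd is the autotest wrapper around scripts/rpc.py, and scheduler_plugin is the RPC plugin used by this scheduler test app). Condensed from the calls visible above, with the thread ids that this run happened to return:

  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100   # pinned thread created at activity 100
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0              # created idle; returned id 11 above
  rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50                        # then raised to activity 50
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100                # returned id 12 above
  rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12                               # and removed again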
00:07:22.841 00:07:22.841 real 0m4.966s 00:07:22.841 user 0m9.783s 00:07:22.841 sys 0m0.483s 00:07:22.841 10:33:58 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.841 10:33:58 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:22.841 ************************************ 00:07:22.841 END TEST event_scheduler 00:07:22.841 ************************************ 00:07:23.099 10:33:58 event -- common/autotest_common.sh@1142 -- # return 0 00:07:23.099 10:33:58 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:23.099 10:33:58 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:23.099 10:33:58 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:23.099 10:33:58 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.099 10:33:58 event -- common/autotest_common.sh@10 -- # set +x 00:07:23.099 ************************************ 00:07:23.099 START TEST app_repeat 00:07:23.099 ************************************ 00:07:23.099 10:33:58 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:23.099 10:33:58 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.099 10:33:58 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.099 10:33:58 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:23.099 10:33:58 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.099 10:33:58 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:23.099 10:33:58 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:23.099 10:33:58 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:23.100 10:33:58 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1984624 00:07:23.100 10:33:58 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:23.100 10:33:58 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:23.100 10:33:58 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1984624' 00:07:23.100 Process app_repeat pid: 1984624 00:07:23.100 10:33:58 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:23.100 10:33:58 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:23.100 spdk_app_start Round 0 00:07:23.100 10:33:58 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1984624 /var/tmp/spdk-nbd.sock 00:07:23.100 10:33:58 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1984624 ']' 00:07:23.100 10:33:58 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:23.100 10:33:58 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:23.100 10:33:58 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:23.100 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:23.100 10:33:58 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:23.100 10:33:58 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:23.100 [2024-07-12 10:33:58.149144] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:07:23.100 [2024-07-12 10:33:58.149196] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1984624 ] 00:07:23.100 [2024-07-12 10:33:58.263612] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:23.358 [2024-07-12 10:33:58.367394] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.358 [2024-07-12 10:33:58.367400] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.925 10:33:59 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:23.925 10:33:59 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:23.925 10:33:59 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:24.183 Malloc0 00:07:24.183 10:33:59 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:24.442 Malloc1 00:07:24.442 10:33:59 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.442 10:33:59 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:24.700 /dev/nbd0 00:07:24.700 10:33:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:24.700 10:33:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.700 1+0 records in 00:07:24.700 1+0 records out 00:07:24.700 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252456 s, 16.2 MB/s 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:24.700 10:33:59 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:24.700 10:33:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.700 10:33:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.700 10:33:59 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:24.959 /dev/nbd1 00:07:24.959 10:34:00 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:24.959 10:34:00 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.959 1+0 records in 00:07:24.959 1+0 records out 00:07:24.959 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216926 s, 18.9 MB/s 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:24.959 10:34:00 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:24.959 10:34:00 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.959 10:34:00 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.959 10:34:00 
event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.959 10:34:00 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.959 10:34:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:25.217 10:34:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:25.217 { 00:07:25.217 "nbd_device": "/dev/nbd0", 00:07:25.217 "bdev_name": "Malloc0" 00:07:25.217 }, 00:07:25.217 { 00:07:25.217 "nbd_device": "/dev/nbd1", 00:07:25.217 "bdev_name": "Malloc1" 00:07:25.217 } 00:07:25.217 ]' 00:07:25.217 10:34:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:25.217 { 00:07:25.217 "nbd_device": "/dev/nbd0", 00:07:25.217 "bdev_name": "Malloc0" 00:07:25.217 }, 00:07:25.217 { 00:07:25.217 "nbd_device": "/dev/nbd1", 00:07:25.217 "bdev_name": "Malloc1" 00:07:25.217 } 00:07:25.217 ]' 00:07:25.217 10:34:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:25.476 /dev/nbd1' 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:25.476 /dev/nbd1' 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:25.476 10:34:00 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:25.477 256+0 records in 00:07:25.477 256+0 records out 00:07:25.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105164 s, 99.7 MB/s 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:25.477 256+0 records in 00:07:25.477 256+0 records out 00:07:25.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0298407 s, 35.1 MB/s 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:25.477 256+0 records in 00:07:25.477 256+0 records out 00:07:25.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0229475 s, 45.7 MB/s 00:07:25.477 10:34:00 event.app_repeat 
-- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:25.477 10:34:00 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:25.735 10:34:00 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:25.993 10:34:01 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.993 10:34:01 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:26.251 10:34:01 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:26.251 10:34:01 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:26.508 10:34:01 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:26.766 [2024-07-12 10:34:01.878547] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:27.025 [2024-07-12 10:34:01.978381] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.025 [2024-07-12 10:34:01.978387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.025 [2024-07-12 10:34:02.030659] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:27.025 [2024-07-12 10:34:02.030709] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:29.560 10:34:04 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:29.560 10:34:04 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:29.560 spdk_app_start Round 1 00:07:29.560 10:34:04 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1984624 /var/tmp/spdk-nbd.sock 00:07:29.560 10:34:04 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1984624 ']' 00:07:29.560 10:34:04 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:29.560 10:34:04 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:29.560 10:34:04 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:29.560 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
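The waitfornbd helper traced in each round above is what makes the NBD attach race-free: it polls /proc/partitions until the kernel publishes the device, then proves the device serves a direct-I/O read before the test touches it. A condensed re-sketch follows; the retry delay and the /tmp scratch path are assumptions (the trace above succeeds on the first pass, so neither is visible), while the grep, dd, stat, and size check mirror the traced helper.

# Hedged sketch of the waitfornbd pattern seen in the trace.
waitfornbd() {
    local nbd_name=$1 i size
    # Wait for the kernel to list the device, up to 20 attempts.
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1                                   # assumption: delay not visible in this trace
    done
    # Confirm the device answers a direct-I/O read of one 4 KiB block.
    for ((i = 1; i <= 20; i++)); do
        if dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct; then
            size=$(stat -c %s /tmp/nbdtest)
            rm -f /tmp/nbdtest
            [ "$size" != 0 ] && return 0            # trace shows size=4096 on success
        fi
        sleep 0.1                                   # assumption, as above
    done
    return 1
}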
00:07:29.560 10:34:04 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:29.560 10:34:04 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:29.818 10:34:04 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:29.818 10:34:04 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:29.818 10:34:04 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:30.076 Malloc0 00:07:30.076 10:34:05 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:30.334 Malloc1 00:07:30.334 10:34:05 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:30.334 10:34:05 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:30.592 /dev/nbd0 00:07:30.592 10:34:05 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:30.592 10:34:05 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:30.592 1+0 records in 00:07:30.592 1+0 records out 00:07:30.592 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225132 s, 18.2 MB/s 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:30.592 10:34:05 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:30.592 10:34:05 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:30.592 10:34:05 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:30.592 10:34:05 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:30.850 /dev/nbd1 00:07:30.850 10:34:05 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:30.850 10:34:05 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:30.850 1+0 records in 00:07:30.850 1+0 records out 00:07:30.850 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242641 s, 16.9 MB/s 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:30.850 10:34:05 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:30.850 10:34:05 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:30.850 10:34:05 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:30.850 10:34:05 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.850 10:34:05 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.850 10:34:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:07:31.108 { 00:07:31.108 "nbd_device": "/dev/nbd0", 00:07:31.108 "bdev_name": "Malloc0" 00:07:31.108 }, 00:07:31.108 { 00:07:31.108 "nbd_device": "/dev/nbd1", 00:07:31.108 "bdev_name": "Malloc1" 00:07:31.108 } 00:07:31.108 ]' 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:31.108 { 00:07:31.108 "nbd_device": "/dev/nbd0", 00:07:31.108 "bdev_name": "Malloc0" 00:07:31.108 }, 00:07:31.108 { 00:07:31.108 "nbd_device": "/dev/nbd1", 00:07:31.108 "bdev_name": "Malloc1" 00:07:31.108 } 00:07:31.108 ]' 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:31.108 /dev/nbd1' 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:31.108 /dev/nbd1' 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:31.108 256+0 records in 00:07:31.108 256+0 records out 00:07:31.108 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00670461 s, 156 MB/s 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:31.108 10:34:06 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:31.366 256+0 records in 00:07:31.366 256+0 records out 00:07:31.366 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0184969 s, 56.7 MB/s 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:31.366 256+0 records in 00:07:31.366 256+0 records out 00:07:31.366 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211506 s, 49.6 MB/s 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.366 10:34:06 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.624 10:34:06 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:31.883 10:34:06 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:32.142 10:34:07 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:32.142 10:34:07 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:32.401 10:34:07 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:32.659 [2024-07-12 10:34:07.714127] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:32.659 [2024-07-12 10:34:07.812461] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:32.659 [2024-07-12 10:34:07.812466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.917 [2024-07-12 10:34:07.859873] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:32.917 [2024-07-12 10:34:07.859916] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:35.445 10:34:10 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:35.445 10:34:10 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:35.445 spdk_app_start Round 2 00:07:35.445 10:34:10 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1984624 /var/tmp/spdk-nbd.sock 00:07:35.445 10:34:10 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1984624 ']' 00:07:35.445 10:34:10 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:35.445 10:34:10 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:35.445 10:34:10 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:35.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
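Stripped of the xtrace noise, the nbd_dd_data_verify write/verify cycle repeated in every app_repeat round above is just dd plus cmp: fill a 1 MiB scratch file from /dev/urandom, push it to each NBD device with O_DIRECT, then byte-compare the first 1 MiB of each device against the source. A minimal sketch, assuming two attached devices and a shortened scratch path:

# Hedged sketch of the write/verify pattern traced above.
nbd_list=(/dev/nbd0 /dev/nbd1)
tmp=/tmp/nbdrandtest                                # assumption: workspace path shortened

# Write phase: 256 x 4 KiB = 1 MiB of random data onto every device.
dd if=/dev/urandom of="$tmp" bs=4096 count=256
for dev in "${nbd_list[@]}"; do
    dd if="$tmp" of="$dev" bs=4096 count=256 oflag=direct
done

# Verify phase: compare the first 1 MiB of each device against the source file.
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp" "$dev"
done
rm "$tmp"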
00:07:35.445 10:34:10 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:35.445 10:34:10 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:35.733 10:34:10 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:35.733 10:34:10 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:35.733 10:34:10 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:35.992 Malloc0 00:07:35.992 10:34:11 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:36.251 Malloc1 00:07:36.251 10:34:11 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:36.251 10:34:11 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:36.511 /dev/nbd0 00:07:36.511 10:34:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:36.511 10:34:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:36.511 1+0 records in 00:07:36.511 1+0 records out 00:07:36.511 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255794 s, 16.0 MB/s 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:36.511 10:34:11 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:36.511 10:34:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.511 10:34:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:36.511 10:34:11 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:36.769 /dev/nbd1 00:07:36.769 10:34:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:36.769 10:34:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:36.769 1+0 records in 00:07:36.769 1+0 records out 00:07:36.769 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268923 s, 15.2 MB/s 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:36.769 10:34:11 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:36.769 10:34:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.769 10:34:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:36.769 10:34:11 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:36.769 10:34:11 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.769 10:34:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:37.027 10:34:12 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:07:37.027 { 00:07:37.027 "nbd_device": "/dev/nbd0", 00:07:37.027 "bdev_name": "Malloc0" 00:07:37.027 }, 00:07:37.027 { 00:07:37.027 "nbd_device": "/dev/nbd1", 00:07:37.027 "bdev_name": "Malloc1" 00:07:37.027 } 00:07:37.027 ]' 00:07:37.027 10:34:12 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:37.028 { 00:07:37.028 "nbd_device": "/dev/nbd0", 00:07:37.028 "bdev_name": "Malloc0" 00:07:37.028 }, 00:07:37.028 { 00:07:37.028 "nbd_device": "/dev/nbd1", 00:07:37.028 "bdev_name": "Malloc1" 00:07:37.028 } 00:07:37.028 ]' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:37.028 /dev/nbd1' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:37.028 /dev/nbd1' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:37.028 256+0 records in 00:07:37.028 256+0 records out 00:07:37.028 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111692 s, 93.9 MB/s 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:37.028 256+0 records in 00:07:37.028 256+0 records out 00:07:37.028 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0187057 s, 56.1 MB/s 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:37.028 256+0 records in 00:07:37.028 256+0 records out 00:07:37.028 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197751 s, 53.0 MB/s 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.028 10:34:12 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.285 10:34:12 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.542 10:34:12 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.799 10:34:12 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:38.057 10:34:13 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:38.057 10:34:13 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:38.315 10:34:13 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:38.574 [2024-07-12 10:34:13.603749] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:38.574 [2024-07-12 10:34:13.702286] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.574 [2024-07-12 10:34:13.702291] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.574 [2024-07-12 10:34:13.753293] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:38.574 [2024-07-12 10:34:13.753344] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:41.856 10:34:16 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1984624 /var/tmp/spdk-nbd.sock 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1984624 ']' 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:41.856 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
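After the devices are stopped, each round counts what is still exported by feeding nbd_get_disks through jq, exactly as traced above (the empty JSON array yields a count of 0). A hedged one-liner version, with the rpc.py path abbreviated relative to a local checkout:

# Hedged sketch: count NBD devices still attached via the per-test RPC socket.
sock=/var/tmp/spdk-nbd.sock
count=$(./spdk/scripts/rpc.py -s "$sock" nbd_get_disks \
        | jq -r '.[] | .nbd_device' \
        | grep -c /dev/nbd || true)                 # grep exits non-zero on zero matches, hence || true
echo "nbd devices still attached: $count"           # expected to be 0 once both disks are stopped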
00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:41.856 10:34:16 event.app_repeat -- event/event.sh@39 -- # killprocess 1984624 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 1984624 ']' 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 1984624 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1984624 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1984624' 00:07:41.856 killing process with pid 1984624 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@967 -- # kill 1984624 00:07:41.856 10:34:16 event.app_repeat -- common/autotest_common.sh@972 -- # wait 1984624 00:07:41.856 spdk_app_start is called in Round 0. 00:07:41.856 Shutdown signal received, stop current app iteration 00:07:41.856 Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 reinitialization... 00:07:41.856 spdk_app_start is called in Round 1. 00:07:41.856 Shutdown signal received, stop current app iteration 00:07:41.856 Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 reinitialization... 00:07:41.856 spdk_app_start is called in Round 2. 00:07:41.857 Shutdown signal received, stop current app iteration 00:07:41.857 Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 reinitialization... 00:07:41.857 spdk_app_start is called in Round 3. 
00:07:41.857 Shutdown signal received, stop current app iteration 00:07:41.857 10:34:16 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:41.857 10:34:16 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:41.857 00:07:41.857 real 0m18.747s 00:07:41.857 user 0m40.642s 00:07:41.857 sys 0m3.851s 00:07:41.857 10:34:16 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.857 10:34:16 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:41.857 ************************************ 00:07:41.857 END TEST app_repeat 00:07:41.857 ************************************ 00:07:41.857 10:34:16 event -- common/autotest_common.sh@1142 -- # return 0 00:07:41.857 10:34:16 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:41.857 00:07:41.857 real 0m28.251s 00:07:41.857 user 0m57.197s 00:07:41.857 sys 0m5.119s 00:07:41.857 10:34:16 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.857 10:34:16 event -- common/autotest_common.sh@10 -- # set +x 00:07:41.857 ************************************ 00:07:41.857 END TEST event 00:07:41.857 ************************************ 00:07:41.857 10:34:16 -- common/autotest_common.sh@1142 -- # return 0 00:07:41.857 10:34:16 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:41.857 10:34:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:41.857 10:34:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.857 10:34:16 -- common/autotest_common.sh@10 -- # set +x 00:07:41.857 ************************************ 00:07:41.857 START TEST thread 00:07:41.857 ************************************ 00:07:41.857 10:34:16 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:42.114 * Looking for test storage... 00:07:42.114 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:42.114 10:34:17 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:42.114 10:34:17 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:42.115 10:34:17 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:42.115 10:34:17 thread -- common/autotest_common.sh@10 -- # set +x 00:07:42.115 ************************************ 00:07:42.115 START TEST thread_poller_perf 00:07:42.115 ************************************ 00:07:42.115 10:34:17 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:42.115 [2024-07-12 10:34:17.144297] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:42.115 [2024-07-12 10:34:17.144363] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987327 ] 00:07:42.115 [2024-07-12 10:34:17.277537] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.372 [2024-07-12 10:34:17.379024] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.372 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:43.304 ====================================== 00:07:43.304 busy:2307107180 (cyc) 00:07:43.304 total_run_count: 266000 00:07:43.304 tsc_hz: 2300000000 (cyc) 00:07:43.304 ====================================== 00:07:43.304 poller_cost: 8673 (cyc), 3770 (nsec) 00:07:43.304 00:07:43.304 real 0m1.365s 00:07:43.304 user 0m1.218s 00:07:43.304 sys 0m0.140s 00:07:43.304 10:34:18 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:43.304 10:34:18 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:43.304 ************************************ 00:07:43.304 END TEST thread_poller_perf 00:07:43.304 ************************************ 00:07:43.561 10:34:18 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:43.561 10:34:18 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:43.561 10:34:18 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:43.561 10:34:18 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:43.561 10:34:18 thread -- common/autotest_common.sh@10 -- # set +x 00:07:43.561 ************************************ 00:07:43.561 START TEST thread_poller_perf 00:07:43.561 ************************************ 00:07:43.561 10:34:18 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:43.561 [2024-07-12 10:34:18.589240] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:43.561 [2024-07-12 10:34:18.589304] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987522 ] 00:07:43.561 [2024-07-12 10:34:18.703023] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.819 [2024-07-12 10:34:18.808973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.819 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:07:44.752 ====================================== 00:07:44.752 busy:2302427260 (cyc) 00:07:44.752 total_run_count: 3495000 00:07:44.752 tsc_hz: 2300000000 (cyc) 00:07:44.752 ====================================== 00:07:44.752 poller_cost: 658 (cyc), 286 (nsec) 00:07:44.752 00:07:44.752 real 0m1.339s 00:07:44.752 user 0m1.206s 00:07:44.752 sys 0m0.127s 00:07:44.752 10:34:19 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.752 10:34:19 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:44.752 ************************************ 00:07:44.752 END TEST thread_poller_perf 00:07:44.752 ************************************ 00:07:44.752 10:34:19 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:44.752 10:34:19 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:44.752 00:07:44.752 real 0m2.965s 00:07:44.752 user 0m2.527s 00:07:44.752 sys 0m0.449s 00:07:44.752 10:34:19 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.752 10:34:19 thread -- common/autotest_common.sh@10 -- # set +x 00:07:44.752 ************************************ 00:07:44.752 END TEST thread 00:07:44.752 ************************************ 00:07:45.010 10:34:19 -- common/autotest_common.sh@1142 -- # return 0 00:07:45.010 10:34:19 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:45.010 10:34:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:45.010 10:34:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:45.010 10:34:19 -- common/autotest_common.sh@10 -- # set +x 00:07:45.010 ************************************ 00:07:45.010 START TEST accel 00:07:45.010 ************************************ 00:07:45.010 10:34:20 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:45.010 * Looking for test storage... 00:07:45.010 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:45.010 10:34:20 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:45.010 10:34:20 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:45.010 10:34:20 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:45.010 10:34:20 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1987763 00:07:45.010 10:34:20 accel -- accel/accel.sh@63 -- # waitforlisten 1987763 00:07:45.010 10:34:20 accel -- common/autotest_common.sh@829 -- # '[' -z 1987763 ']' 00:07:45.010 10:34:20 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:45.010 10:34:20 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:45.010 10:34:20 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:45.010 10:34:20 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:45.010 10:34:20 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:45.010 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
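The poller_cost lines in the two runs above are plain arithmetic over the numbers printed next to them: cost in cycles is busy cycles divided by total_run_count, and the nanosecond figure converts that through tsc_hz. The two invocations differ only in -l (poller period in microseconds; -b is the poller count and -t the duration, matching the "Running 1000 pollers for 1 seconds..." banners). A throwaway check of both runs, not part of poller_perf itself:

    check() {
        local busy=$1 runs=$2 hz=$3
        local cyc=$(( busy / runs ))
        echo "poller_cost: ${cyc} (cyc), $(( cyc * 1000000000 / hz )) (nsec)"
    }
    check 2307107180  266000 2300000000   # 1 us period run -> 8673 (cyc), 3770 (nsec)
    check 2302427260 3495000 2300000000   # 0 us period run ->  658 (cyc),  286 (nsec)

Per-call overhead drops from roughly 8.7k cycles for the 1 us timed pollers to about 660 cycles with a zero period, which is what running the benchmark twice is meant to show.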
00:07:45.010 10:34:20 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:45.010 10:34:20 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.010 10:34:20 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.010 10:34:20 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.010 10:34:20 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.010 10:34:20 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.010 10:34:20 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.010 10:34:20 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:45.010 10:34:20 accel -- accel/accel.sh@41 -- # jq -r . 00:07:45.268 [2024-07-12 10:34:20.209141] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:45.268 [2024-07-12 10:34:20.209208] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987763 ] 00:07:45.268 [2024-07-12 10:34:20.336948] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.268 [2024-07-12 10:34:20.438100] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.202 10:34:21 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:46.202 10:34:21 accel -- common/autotest_common.sh@862 -- # return 0 00:07:46.202 10:34:21 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:46.202 10:34:21 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:46.202 10:34:21 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:46.202 10:34:21 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:46.202 10:34:21 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:46.202 10:34:21 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:46.202 10:34:21 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.202 10:34:21 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:46.202 10:34:21 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.202 10:34:21 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.202 
10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.202 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.202 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.203 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.203 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.203 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.203 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.203 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.203 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.203 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.203 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.203 10:34:21 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:46.203 10:34:21 accel -- accel/accel.sh@72 -- # IFS== 00:07:46.203 10:34:21 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:46.203 10:34:21 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:46.203 10:34:21 accel -- accel/accel.sh@75 -- # killprocess 1987763 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@948 -- # '[' -z 1987763 ']' 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@952 -- # kill -0 1987763 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@953 -- # uname 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1987763 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1987763' 00:07:46.203 killing process with pid 1987763 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@967 -- # kill 1987763 00:07:46.203 10:34:21 accel -- common/autotest_common.sh@972 -- # wait 1987763 00:07:46.460 10:34:21 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:46.460 10:34:21 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:46.460 10:34:21 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:46.460 10:34:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.460 10:34:21 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.460 10:34:21 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:46.460 10:34:21 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:46.460 10:34:21 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:46.461 10:34:21 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.461 10:34:21 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.461 10:34:21 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.461 10:34:21 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.461 10:34:21 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.461 10:34:21 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:46.461 10:34:21 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
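The long IFS== loop above is just consuming the output of a single RPC: accel_get_opc_assignments returns a JSON object mapping each opcode to the module that will service it, and the jq program flattens that into opcode=module pairs; in this run every opcode comes back assigned to the software module. The same view can be reproduced directly with the rpc.py call and jq filter from the trace (sketch only; the socket path is assumed to be the default /var/tmp/spdk.sock used by this spdk_tgt):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    "$rpc" -s /var/tmp/spdk.sock accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
    # One line per opcode, e.g. copy=software (exact opcode names depend on the SPDK build).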
00:07:46.461 10:34:21 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.461 10:34:21 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:46.719 10:34:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:46.719 10:34:21 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:46.719 10:34:21 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:46.719 10:34:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.719 10:34:21 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.719 ************************************ 00:07:46.719 START TEST accel_missing_filename 00:07:46.719 ************************************ 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:46.719 10:34:21 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:46.719 10:34:21 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:46.719 [2024-07-12 10:34:21.729857] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:46.719 [2024-07-12 10:34:21.729921] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987979 ] 00:07:46.719 [2024-07-12 10:34:21.856781] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.977 [2024-07-12 10:34:21.957681] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.977 [2024-07-12 10:34:22.032558] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:46.977 [2024-07-12 10:34:22.106622] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:47.234 A filename is required. 
00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:47.234 00:07:47.234 real 0m0.499s 00:07:47.234 user 0m0.333s 00:07:47.234 sys 0m0.190s 00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.234 10:34:22 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:47.234 ************************************ 00:07:47.234 END TEST accel_missing_filename 00:07:47.234 ************************************ 00:07:47.234 10:34:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.234 10:34:22 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.234 10:34:22 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:47.234 10:34:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.234 10:34:22 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.234 ************************************ 00:07:47.234 START TEST accel_compress_verify 00:07:47.234 ************************************ 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:47.234 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.234 10:34:22 
accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:47.234 10:34:22 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:47.234 [2024-07-12 10:34:22.320669] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:47.234 [2024-07-12 10:34:22.320731] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988123 ] 00:07:47.492 [2024-07-12 10:34:22.449500] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.492 [2024-07-12 10:34:22.549601] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.492 [2024-07-12 10:34:22.617992] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:47.749 [2024-07-12 10:34:22.691845] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:47.749 00:07:47.749 Compression does not support the verify option, aborting. 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:47.749 00:07:47.749 real 0m0.502s 00:07:47.749 user 0m0.328s 00:07:47.749 sys 0m0.203s 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.749 10:34:22 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:47.749 ************************************ 00:07:47.749 END TEST accel_compress_verify 00:07:47.749 ************************************ 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.749 10:34:22 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.749 ************************************ 00:07:47.749 START TEST accel_wrong_workload 00:07:47.749 ************************************ 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
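The es bookkeeping in the two NOT tests above (es=234 -> 106 -> 1, then es=161 -> 33 -> 1) is the NOT helper normalizing the wrapped command's exit status: a value above 128 is a signal-style exit, so the 128 bit is stripped before the remaining code is collapsed to a plain failure, and NOT itself succeeds only because the wrapped accel_perf run failed. A paraphrase of what the trace shows (assumed reconstruction, not the actual autotest_common.sh source):

    not_sketch() {
        local es=0
        "$@" || es=$?                         # run the command that is expected to fail
        (( es > 128 )) && es=$(( es - 128 ))  # strip the signal bit: 234 -> 106, 161 -> 33
        (( es != 0 )) && es=1                 # collapse any remaining error code to 1
        (( !es == 0 ))                        # succeed only when the wrapped command failed
    }

In these tests it wraps accel_perf invocations that are supposed to abort, so a green result means the error path fired; the same pattern repeats for the wrong-workload and negative-buffer cases that follow.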
00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:47.749 10:34:22 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:47.749 Unsupported workload type: foobar 00:07:47.749 [2024-07-12 10:34:22.898522] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:47.749 accel_perf options: 00:07:47.749 [-h help message] 00:07:47.749 [-q queue depth per core] 00:07:47.749 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:47.749 [-T number of threads per core 00:07:47.749 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:47.749 [-t time in seconds] 00:07:47.749 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:47.749 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:47.749 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:47.749 [-l for compress/decompress workloads, name of uncompressed input file 00:07:47.749 [-S for crc32c workload, use this seed value (default 0) 00:07:47.749 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:47.749 [-f for fill workload, use this BYTE value (default 255) 00:07:47.749 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:47.749 [-y verify result if this switch is on] 00:07:47.749 [-a tasks to allocate per core (default: same value as -q)] 00:07:47.749 Can be used to spread operations across a wider range of memory. 
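The usage text above is what accel_perf prints when it rejects -w foobar; the positive tests that follow exercise the crc32c path with flags drawn from this same list. A representative invocation assembled only from flags shown in the help output and in the crc32c traces below (binary path as used throughout this workspace):

    # -t 1: run for 1 second, -w crc32c: workload, -S 32: crc32c seed,
    # -o 4096: 4 KiB transfers, -y: verify results.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w crc32c -S 32 -o 4096 -y

The negative cases on either side simply feed the same binary -w foobar and -x -1, both of which it rejects before starting.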
00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:47.749 00:07:47.749 real 0m0.043s 00:07:47.749 user 0m0.026s 00:07:47.749 sys 0m0.017s 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.749 10:34:22 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:47.749 ************************************ 00:07:47.749 END TEST accel_wrong_workload 00:07:47.749 ************************************ 00:07:47.749 Error: writing output failed: Broken pipe 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.749 10:34:22 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.749 10:34:22 accel -- common/autotest_common.sh@10 -- # set +x 00:07:48.006 ************************************ 00:07:48.006 START TEST accel_negative_buffers 00:07:48.006 ************************************ 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.006 10:34:22 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:48.006 10:34:22 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:48.006 -x option must be non-negative. 
00:07:48.006 [2024-07-12 10:34:23.016890] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:48.006 accel_perf options: 00:07:48.006 [-h help message] 00:07:48.006 [-q queue depth per core] 00:07:48.006 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:48.006 [-T number of threads per core 00:07:48.006 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:48.006 [-t time in seconds] 00:07:48.006 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:48.006 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:48.006 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:48.006 [-l for compress/decompress workloads, name of uncompressed input file 00:07:48.006 [-S for crc32c workload, use this seed value (default 0) 00:07:48.006 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:48.006 [-f for fill workload, use this BYTE value (default 255) 00:07:48.006 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:48.006 [-y verify result if this switch is on] 00:07:48.006 [-a tasks to allocate per core (default: same value as -q)] 00:07:48.006 Can be used to spread operations across a wider range of memory. 00:07:48.006 10:34:23 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:48.006 10:34:23 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:48.006 10:34:23 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:48.006 10:34:23 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:48.006 00:07:48.006 real 0m0.044s 00:07:48.006 user 0m0.021s 00:07:48.006 sys 0m0.023s 00:07:48.006 10:34:23 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:48.006 10:34:23 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:48.006 ************************************ 00:07:48.006 END TEST accel_negative_buffers 00:07:48.006 ************************************ 00:07:48.006 Error: writing output failed: Broken pipe 00:07:48.006 10:34:23 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:48.006 10:34:23 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:48.006 10:34:23 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:48.006 10:34:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:48.006 10:34:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:48.006 ************************************ 00:07:48.006 START TEST accel_crc32c 00:07:48.006 ************************************ 00:07:48.006 10:34:23 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:48.006 10:34:23 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:48.006 [2024-07-12 10:34:23.115692] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:48.006 [2024-07-12 10:34:23.115749] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988236 ] 00:07:48.264 [2024-07-12 10:34:23.245007] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.264 [2024-07-12 10:34:23.345808] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.264 10:34:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.633 10:34:24 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:49.633 10:34:24 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.633 00:07:49.633 real 0m1.508s 00:07:49.633 user 0m1.314s 00:07:49.633 sys 0m0.198s 00:07:49.633 10:34:24 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:49.633 10:34:24 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:49.633 ************************************ 00:07:49.633 END TEST accel_crc32c 00:07:49.633 ************************************ 00:07:49.633 10:34:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:49.633 10:34:24 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:49.633 10:34:24 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:49.633 10:34:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:49.633 10:34:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.633 ************************************ 00:07:49.633 START TEST accel_crc32c_C2 00:07:49.633 ************************************ 00:07:49.633 10:34:24 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:49.633 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:49.633 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:49.633 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:49.634 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:49.634 [2024-07-12 10:34:24.688329] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:49.634 [2024-07-12 10:34:24.688386] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988432 ] 00:07:49.634 [2024-07-12 10:34:24.818427] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.892 [2024-07-12 10:34:24.920008] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.892 10:34:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:49.892 10:34:25 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.892 10:34:25 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # read -r var val 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.267 00:07:51.267 real 0m1.519s 00:07:51.267 user 0m1.321s 00:07:51.267 sys 0m0.200s 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:51.267 10:34:26 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:51.267 ************************************ 00:07:51.267 END TEST accel_crc32c_C2 00:07:51.267 ************************************ 00:07:51.267 10:34:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:51.267 10:34:26 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:51.267 10:34:26 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:51.267 10:34:26 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:51.267 10:34:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:51.267 ************************************ 00:07:51.267 START TEST accel_copy 00:07:51.267 ************************************ 00:07:51.267 10:34:26 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w 
copy -y 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:51.267 10:34:26 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:51.267 [2024-07-12 10:34:26.263153] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:51.267 [2024-07-12 10:34:26.263210] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988679 ] 00:07:51.267 [2024-07-12 10:34:26.392771] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.526 [2024-07-12 10:34:26.494723] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:51.526 10:34:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:52.533 10:34:27 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:52.792 10:34:27 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:52.792 10:34:27 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.792 00:07:52.792 real 0m1.496s 00:07:52.792 user 0m1.302s 00:07:52.792 sys 0m0.200s 00:07:52.792 10:34:27 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:52.792 10:34:27 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:52.792 ************************************ 00:07:52.792 END TEST accel_copy 00:07:52.792 ************************************ 00:07:52.792 10:34:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:52.792 10:34:27 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:52.792 10:34:27 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:52.792 10:34:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:52.792 10:34:27 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.792 ************************************ 00:07:52.792 START TEST accel_fill 00:07:52.792 ************************************ 00:07:52.792 10:34:27 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 
]] 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:52.792 10:34:27 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:52.792 [2024-07-12 10:34:27.844683] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:52.792 [2024-07-12 10:34:27.844743] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988974 ] 00:07:52.792 [2024-07-12 10:34:27.975605] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.051 [2024-07-12 10:34:28.073985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 
10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.051 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:53.052 10:34:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@21 
-- # case "$var" in 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:54.426 10:34:29 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:54.427 10:34:29 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:54.427 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:54.427 10:34:29 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:54.427 10:34:29 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:54.427 10:34:29 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:54.427 10:34:29 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:54.427 00:07:54.427 real 0m1.494s 00:07:54.427 user 0m1.313s 00:07:54.427 sys 0m0.186s 00:07:54.427 10:34:29 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.427 10:34:29 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:54.427 ************************************ 00:07:54.427 END TEST accel_fill 00:07:54.427 ************************************ 00:07:54.427 10:34:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:54.427 10:34:29 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:54.427 10:34:29 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:54.427 10:34:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.427 10:34:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:54.427 ************************************ 00:07:54.427 START TEST accel_copy_crc32c 00:07:54.427 ************************************ 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:54.427 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:54.427 [2024-07-12 10:34:29.399293] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:54.427 [2024-07-12 10:34:29.399350] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1989185 ] 00:07:54.427 [2024-07-12 10:34:29.526746] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.686 [2024-07-12 10:34:29.626388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.686 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:54.687 10:34:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 
00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:56.065 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:56.066 00:07:56.066 real 0m1.500s 00:07:56.066 user 0m1.294s 00:07:56.066 sys 0m0.210s 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:56.066 10:34:30 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:56.066 ************************************ 00:07:56.066 END TEST accel_copy_crc32c 00:07:56.066 ************************************ 00:07:56.066 10:34:30 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:56.066 10:34:30 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:56.066 10:34:30 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:56.066 10:34:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:56.066 10:34:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.066 ************************************ 00:07:56.066 START TEST accel_copy_crc32c_C2 00:07:56.066 ************************************ 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:56.066 10:34:30 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:56.066 10:34:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:56.066 [2024-07-12 10:34:30.977971] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:07:56.066 [2024-07-12 10:34:30.978030] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1989379 ] 00:07:56.066 [2024-07-12 10:34:31.095937] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.066 [2024-07-12 10:34:31.196347] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # 
IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.325 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.326 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.326 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.326 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:56.326 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:56.326 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:56.326 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:56.326 10:34:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:57.261 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:57.262 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:57.262 10:34:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.262 00:07:57.262 real 0m1.491s 00:07:57.262 user 0m1.314s 
00:07:57.262 sys 0m0.183s 00:07:57.262 10:34:32 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.262 10:34:32 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:57.262 ************************************ 00:07:57.262 END TEST accel_copy_crc32c_C2 00:07:57.262 ************************************ 00:07:57.520 10:34:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:57.520 10:34:32 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:57.520 10:34:32 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:57.520 10:34:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.520 10:34:32 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.520 ************************************ 00:07:57.520 START TEST accel_dualcast 00:07:57.520 ************************************ 00:07:57.520 10:34:32 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:57.520 10:34:32 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:57.520 [2024-07-12 10:34:32.546833] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:07:57.520 [2024-07-12 10:34:32.546896] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1989579 ] 00:07:57.520 [2024-07-12 10:34:32.675976] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.781 [2024-07-12 10:34:32.777310] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:57.781 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:57.782 10:34:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:59.160 10:34:34 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:59.160 00:07:59.160 real 0m1.508s 00:07:59.160 user 0m1.322s 00:07:59.160 sys 0m0.191s 00:07:59.160 10:34:34 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:59.160 10:34:34 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:59.160 ************************************ 00:07:59.160 END TEST accel_dualcast 00:07:59.160 ************************************ 00:07:59.160 10:34:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:59.160 10:34:34 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:59.160 10:34:34 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:59.160 10:34:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.160 10:34:34 accel -- common/autotest_common.sh@10 -- # set +x 00:07:59.160 ************************************ 00:07:59.160 START TEST accel_compare 00:07:59.160 ************************************ 00:07:59.160 10:34:34 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:59.160 10:34:34 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:59.160 [2024-07-12 10:34:34.120061] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:07:59.160 [2024-07-12 10:34:34.120119] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1989772 ] 00:07:59.160 [2024-07-12 10:34:34.239387] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.160 [2024-07-12 10:34:34.343810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 
00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.419 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:59.420 10:34:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@20 -- # val= 
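The IFS=: / read -r var val / case "$var" entries that dominate this stretch come from accel.sh (around lines 19-23) parsing what appears to be a colon-separated summary printed by accel_perf, in order to learn which opcode and module were actually exercised. A rough stand-alone sketch of that pattern, with illustrative match strings rather than the script's exact ones, assuming a local ./build/examples/accel_perf:

  # Split each "key: value" line of accel_perf's output on ':' and record
  # the reported workload and module; the match strings are illustrative only.
  while IFS=: read -r var val; do
      case "$var" in
          *opcode*) accel_opc=${val//[[:space:]]/} ;;
          *module*) accel_module=${val//[[:space:]]/} ;;
      esac
  done < <(./build/examples/accel_perf -t 1 -w compare -y)
  # Mirrors the accel.sh@27 checks traced at the end of each test case.
  [[ -n $accel_opc && $accel_module == software ]] && echo "compare ran on the software module"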
00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:08:00.798 10:34:35 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:00.798 00:08:00.798 real 0m1.496s 00:08:00.798 user 0m1.314s 00:08:00.798 sys 0m0.176s 00:08:00.798 10:34:35 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.798 10:34:35 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:08:00.798 ************************************ 00:08:00.798 END TEST accel_compare 00:08:00.798 ************************************ 00:08:00.798 10:34:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:00.798 10:34:35 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:00.798 10:34:35 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:00.798 10:34:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.798 10:34:35 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.798 ************************************ 00:08:00.798 START TEST accel_xor 00:08:00.798 ************************************ 00:08:00.798 10:34:35 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:00.798 [2024-07-12 10:34:35.684196] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
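The xor case launched above runs with accel_perf's default of two source buffers; -y again asks the tool to verify that the destination matches the XOR of the sources. A hedged equivalent manual run, under the same local-build assumption as before:

  # XOR two source buffers into a destination for 1 second and verify the result.
  ./build/examples/accel_perf -t 1 -w xor -y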
00:08:00.798 [2024-07-12 10:34:35.684254] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1990040 ] 00:08:00.798 [2024-07-12 10:34:35.802640] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.798 [2024-07-12 10:34:35.903261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:00.798 10:34:35 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:00.798 10:34:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:02.174 00:08:02.174 real 0m1.501s 00:08:02.174 user 0m1.329s 00:08:02.174 sys 0m0.172s 00:08:02.174 10:34:37 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.174 10:34:37 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:02.174 ************************************ 00:08:02.174 END TEST accel_xor 00:08:02.174 ************************************ 00:08:02.174 10:34:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.174 10:34:37 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:02.174 10:34:37 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:02.174 10:34:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.174 10:34:37 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.174 ************************************ 00:08:02.174 START TEST accel_xor 00:08:02.174 ************************************ 00:08:02.174 10:34:37 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:02.174 10:34:37 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:02.174 [2024-07-12 10:34:37.258785] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
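This second xor pass differs only in the -x 3 flag, which raises the source-buffer count from the default two to three. A small sketch that sweeps the counts the suite exercises, assuming the same local build:

  # Repeat the xor workload with two and then three source buffers.
  for n in 2 3; do
      ./build/examples/accel_perf -t 1 -w xor -y -x "$n"
  done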
00:08:02.174 [2024-07-12 10:34:37.258841] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1990327 ] 00:08:02.433 [2024-07-12 10:34:37.386735] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.433 [2024-07-12 10:34:37.487098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:02.433 10:34:37 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:02.433 10:34:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:03.809 10:34:38 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:03.809 00:08:03.809 real 0m1.500s 00:08:03.809 user 0m1.316s 00:08:03.809 sys 0m0.186s 00:08:03.809 10:34:38 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.809 10:34:38 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:03.809 ************************************ 00:08:03.809 END TEST accel_xor 00:08:03.809 ************************************ 00:08:03.809 10:34:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.809 10:34:38 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:03.809 10:34:38 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:03.809 10:34:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.809 10:34:38 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.809 ************************************ 00:08:03.809 START TEST accel_dif_verify 00:08:03.809 ************************************ 00:08:03.809 10:34:38 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:03.809 10:34:38 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:08:03.809 [2024-07-12 10:34:38.825546] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
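The dif_verify case starting here traces '4096 bytes' buffers alongside '512 bytes' and '8 bytes' values, which appear to be the DIF block and metadata sizes read back from accel_perf's configuration output. A sketch of the bare invocation, leaving those sizes at whatever the binary reports by default, since in this run the harness supplies its config over /dev/fd/62:

  # Exercise DIF verification on the software module for 1 second.
  ./build/examples/accel_perf -t 1 -w dif_verify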
00:08:03.809 [2024-07-12 10:34:38.825605] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1990527 ] 00:08:03.809 [2024-07-12 10:34:38.953926] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.068 [2024-07-12 10:34:39.054412] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:04.068 10:34:39 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:05.444 10:34:40 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:05.444 10:34:40 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:05.444 00:08:05.444 real 0m1.497s 00:08:05.444 user 0m1.316s 00:08:05.444 sys 0m0.186s 00:08:05.444 10:34:40 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.444 10:34:40 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:08:05.444 ************************************ 00:08:05.444 END TEST accel_dif_verify 00:08:05.444 ************************************ 00:08:05.444 10:34:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:05.444 10:34:40 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:05.444 10:34:40 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:05.444 10:34:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.444 10:34:40 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.444 ************************************ 00:08:05.444 START TEST accel_dif_generate 00:08:05.444 ************************************ 00:08:05.444 10:34:40 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 
-- # read -r var val 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:08:05.444 10:34:40 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:08:05.444 [2024-07-12 10:34:40.412040] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:05.444 [2024-07-12 10:34:40.412098] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1990730 ] 00:08:05.444 [2024-07-12 10:34:40.540095] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.444 [2024-07-12 10:34:40.636610] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@23 -- 
# accel_opc=dif_generate 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:05.703 10:34:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:07.081 10:34:41 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:07.081 00:08:07.081 real 0m1.486s 00:08:07.081 user 0m1.299s 00:08:07.081 sys 0m0.190s 00:08:07.081 
10:34:41 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:07.081 10:34:41 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:07.081 ************************************ 00:08:07.081 END TEST accel_dif_generate 00:08:07.081 ************************************ 00:08:07.081 10:34:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:07.081 10:34:41 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:07.081 10:34:41 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:07.081 10:34:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.081 10:34:41 accel -- common/autotest_common.sh@10 -- # set +x 00:08:07.081 ************************************ 00:08:07.081 START TEST accel_dif_generate_copy 00:08:07.081 ************************************ 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:07.081 10:34:41 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:08:07.081 [2024-07-12 10:34:41.975248] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
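dif_generate_copy, starting above, is the last of the three DIF variants this run covers (dif_verify and dif_generate were traced just before it). A sketch that replays the trio in the same order as the run_test sequence, under the same local-build assumption:

  # Replay the three DIF workloads back to back, one second each.
  for w in dif_verify dif_generate dif_generate_copy; do
      ./build/examples/accel_perf -t 1 -w "$w"
  done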
00:08:07.081 [2024-07-12 10:34:41.975307] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1990925 ] 00:08:07.081 [2024-07-12 10:34:42.103555] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.081 [2024-07-12 10:34:42.204702] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.081 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.341 10:34:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:08.277 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:08.278 10:34:43 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:08.278 00:08:08.278 real 0m1.503s 00:08:08.278 user 0m1.309s 00:08:08.278 sys 0m0.200s 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:08.278 10:34:43 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:08.278 ************************************ 00:08:08.278 END TEST accel_dif_generate_copy 00:08:08.278 ************************************ 00:08:08.537 10:34:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:08.537 10:34:43 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:08.537 10:34:43 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.537 10:34:43 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:08.537 10:34:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.537 10:34:43 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.537 ************************************ 00:08:08.537 START TEST accel_comp 00:08:08.537 ************************************ 00:08:08.537 10:34:43 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:08.537 
10:34:43 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:08.537 10:34:43 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:08.537 [2024-07-12 10:34:43.534400] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:08.537 [2024-07-12 10:34:43.534456] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1991124 ] 00:08:08.537 [2024-07-12 10:34:43.664467] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.797 [2024-07-12 10:34:43.766836] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 
10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:08.797 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.798 10:34:43 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.798 10:34:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:10.216 10:34:45 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:10.216 00:08:10.216 real 0m1.506s 00:08:10.216 user 0m1.313s 00:08:10.216 sys 0m0.194s 00:08:10.216 10:34:45 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.216 10:34:45 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:10.216 ************************************ 00:08:10.217 END TEST accel_comp 00:08:10.217 ************************************ 00:08:10.217 10:34:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:10.217 10:34:45 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:10.217 10:34:45 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:10.217 10:34:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.217 10:34:45 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.217 ************************************ 00:08:10.217 START TEST accel_decomp 
00:08:10.217 ************************************ 00:08:10.217 10:34:45 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:10.217 10:34:45 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:10.217 [2024-07-12 10:34:45.127493] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:10.217 [2024-07-12 10:34:45.127553] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1991369 ] 00:08:10.217 [2024-07-12 10:34:45.246220] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.217 [2024-07-12 10:34:45.347169] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.475 
10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:10.475 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:10.476 10:34:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:11.411 10:34:46 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.411 00:08:11.411 real 0m1.505s 00:08:11.411 user 0m1.324s 00:08:11.411 sys 0m0.185s 00:08:11.411 10:34:46 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.411 10:34:46 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:11.411 ************************************ 00:08:11.411 END TEST accel_decomp 00:08:11.411 ************************************ 
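The compress/decompress cases above are all driven through the same accel_perf example binary that the trace shows accel.sh launching. As a minimal sketch, using only the binary path, input file, and flags visible in the trace (the JSON accel config that accel.sh normally pipes in via "-c /dev/fd/62" is assumed unnecessary for the default software module and is left out here), the 1-second software decompress run could be reproduced by hand roughly like this:

# Rough sketch, not part of the test suite: rerun the software decompress case from this trace.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# -t 1          : run the workload for 1 second (matches the '1 seconds' value set above)
# -w decompress : workload type under test
# -l .../bib    : compressed input file used by the accel tests
# -y            : verify flag that accel.sh passes for its decompress runs
"$SPDK"/build/examples/accel_perf -t 1 -w decompress -l "$SPDK"/test/accel/bib -y

The later variants recorded below appear to widen this same invocation rather than change it: "-o 0" for the "full" cases (the trace shows the buffer growing from '4096 bytes' to '111250 bytes') and "-m 0xf" for the mcore cases (four reactors are started, matching the four cores reported available).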
00:08:11.669 10:34:46 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.669 10:34:46 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.669 10:34:46 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:11.669 10:34:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.669 10:34:46 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.669 ************************************ 00:08:11.669 START TEST accel_decomp_full 00:08:11.669 ************************************ 00:08:11.669 10:34:46 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.669 10:34:46 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:11.669 10:34:46 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:11.669 10:34:46 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.669 10:34:46 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.669 10:34:46 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.669 10:34:46 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.669 10:34:46 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:11.670 10:34:46 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.670 10:34:46 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.670 10:34:46 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.670 10:34:46 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.670 10:34:46 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.670 10:34:46 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:11.670 10:34:46 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:11.670 [2024-07-12 10:34:46.710519] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:08:11.670 [2024-07-12 10:34:46.710578] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1991672 ] 00:08:11.670 [2024-07-12 10:34:46.836167] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.928 [2024-07-12 10:34:46.937037] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.928 10:34:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:13.302 10:34:48 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:13.302 00:08:13.302 real 0m1.518s 00:08:13.302 user 0m1.336s 00:08:13.302 sys 0m0.186s 00:08:13.302 10:34:48 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:13.302 10:34:48 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:13.302 ************************************ 00:08:13.302 END TEST accel_decomp_full 00:08:13.302 ************************************ 00:08:13.302 10:34:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:13.302 10:34:48 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:13.302 10:34:48 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:13.302 10:34:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.302 10:34:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.302 ************************************ 00:08:13.302 START TEST accel_decomp_mcore 00:08:13.302 ************************************ 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 
-m 0xf 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:13.302 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:13.302 [2024-07-12 10:34:48.279254] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:13.302 [2024-07-12 10:34:48.279310] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1991873 ] 00:08:13.302 [2024-07-12 10:34:48.407238] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:13.562 [2024-07-12 10:34:48.510585] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:13.562 [2024-07-12 10:34:48.510611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:13.562 [2024-07-12 10:34:48.510692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.562 [2024-07-12 10:34:48.510688] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.562 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val='1 seconds' 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:13.563 10:34:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:14.941 00:08:14.941 real 0m1.503s 00:08:14.941 user 0m4.739s 00:08:14.941 sys 0m0.202s 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.941 10:34:49 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:14.941 ************************************ 00:08:14.941 END TEST accel_decomp_mcore 00:08:14.941 ************************************ 00:08:14.941 10:34:49 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:14.941 10:34:49 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:14.941 10:34:49 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:14.941 10:34:49 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.941 10:34:49 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.941 ************************************ 00:08:14.941 START TEST accel_decomp_full_mcore 00:08:14.941 ************************************ 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.941 
10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:14.941 10:34:49 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:14.941 [2024-07-12 10:34:49.868131] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:14.941 [2024-07-12 10:34:49.868229] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1992077 ] 00:08:14.941 [2024-07-12 10:34:50.001314] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:14.941 [2024-07-12 10:34:50.111452] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:14.941 [2024-07-12 10:34:50.111541] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:14.941 [2024-07-12 10:34:50.111568] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:14.941 [2024-07-12 10:34:50.111572] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:15.202 10:34:50 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:15.202 10:34:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:16.582 10:34:51 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:16.582 00:08:16.583 real 0m1.555s 00:08:16.583 user 0m4.835s 00:08:16.583 sys 0m0.217s 00:08:16.583 10:34:51 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:16.583 10:34:51 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:16.583 ************************************ 00:08:16.583 END TEST accel_decomp_full_mcore 00:08:16.583 ************************************ 00:08:16.583 10:34:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:16.583 10:34:51 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:16.583 10:34:51 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:16.583 10:34:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.583 10:34:51 accel -- common/autotest_common.sh@10 -- # set +x 00:08:16.583 ************************************ 00:08:16.583 START TEST accel_decomp_mthread 00:08:16.583 ************************************ 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:16.583 
10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:16.583 [2024-07-12 10:34:51.463585] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:16.583 [2024-07-12 10:34:51.463624] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1992273 ] 00:08:16.583 [2024-07-12 10:34:51.574544] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.583 [2024-07-12 10:34:51.674806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:16.583 10:34:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:16.583 10:34:51 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:17.961 00:08:17.961 real 0m1.480s 00:08:17.961 user 0m1.306s 00:08:17.961 sys 0m0.179s 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:17.961 10:34:52 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:17.961 ************************************ 00:08:17.961 END TEST accel_decomp_mthread 00:08:17.961 ************************************ 00:08:17.961 10:34:52 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:17.961 10:34:52 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:17.961 10:34:52 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
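The long runs of accel.sh@19-23 entries above (IFS=:, read -r var val, case "$var" in, accel_opc=..., accel_module=...) are the xtrace of the harness parsing accel_perf's "key: value" configuration dump to record which opcode and module actually ran, and the accel.sh@27 assertions close each run by checking those values. A rough paraphrase of that loop follows; the exact case patterns are an assumption (only the resulting assignments are visible in the trace), and the harness additionally passes -c /dev/fd/62 with its JSON accel config, omitted here since no non-default module is configured at this point:

    while IFS=: read -r var val; do
        case "$var" in
            *Module*) accel_module=$(echo "$val" | awk '{print $1}') ;;    # assumed pattern
            *Workload*) accel_opc=$(echo "$val" | awk '{print $1}') ;;     # assumed pattern
        esac
    done < <(/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
                 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib \
                 -y -T 2)    # per-test flags vary; these are the accel_decomp_mthread ones
    [[ -n $accel_module && -n $accel_opc ]]    # the kind of check seen at accel.sh@27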
00:08:17.961 10:34:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.961 10:34:52 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.961 ************************************ 00:08:17.961 START TEST accel_decomp_full_mthread 00:08:17.961 ************************************ 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:17.961 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:17.961 [2024-07-12 10:34:53.037173] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
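The accel.sh@12 entry just above spells out the accel_perf command behind accel_decomp_full_mthread. Ignoring the /dev/fd/62 config descriptor (which the harness only needs to inject a non-default accel module, not the case for this software run), a standalone equivalent from the workspace checkout would look roughly like:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -T 2

Judging from the values read back in the trace rather than from accel_perf's help text, -o 0 is what makes this the "full" variant (the whole 111250-byte bib file instead of the default 4096-byte transfer size), and -T 2 adds the second worker thread that makes it "mthread".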
00:08:17.961 [2024-07-12 10:34:53.037231] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1992473 ] 00:08:18.220 [2024-07-12 10:34:53.167386] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.220 [2024-07-12 10:34:53.266751] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 
10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:18.220 10:34:53 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:18.220 10:34:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:19.599 00:08:19.599 real 0m1.534s 00:08:19.599 user 0m1.355s 00:08:19.599 sys 0m0.183s 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:19.599 10:34:54 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:19.599 ************************************ 00:08:19.599 END TEST accel_decomp_full_mthread 00:08:19.599 ************************************ 
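Every test in this log is wrapped by run_test from autotest_common.sh, which is what produces the START TEST/END TEST banners and the real/user/sys timings at each boundary, such as the one just above. Its rough shape, paraphrased from those banners, the bash time output, and the autotest_common.sh line references in the trace rather than copied from source:

    run_test() {
        local test_name=$1
        shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"    # the traced '[' N -le 1 ']' and xtrace_disable/return 0 calls
                     # around this point are argument checks and trace control
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
    }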
00:08:19.599 10:34:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:19.599 10:34:54 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:19.599 10:34:54 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:19.599 10:34:54 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:19.600 10:34:54 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:19.600 10:34:54 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1992726 00:08:19.600 10:34:54 accel -- accel/accel.sh@63 -- # waitforlisten 1992726 00:08:19.600 10:34:54 accel -- common/autotest_common.sh@829 -- # '[' -z 1992726 ']' 00:08:19.600 10:34:54 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:19.600 10:34:54 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:19.600 10:34:54 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:19.600 10:34:54 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:19.600 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:19.600 10:34:54 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:19.600 10:34:54 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:19.600 10:34:54 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:19.600 10:34:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:19.600 10:34:54 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:19.600 10:34:54 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.600 10:34:54 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.600 10:34:54 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:19.600 10:34:54 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:19.600 10:34:54 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:19.600 10:34:54 accel -- accel/accel.sh@41 -- # jq -r . 00:08:19.600 [2024-07-12 10:34:54.648100] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
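At this point the harness has set COMPRESSDEV=1, so build_accel_config appends the compressdev entry shown at accel.sh@37 and get_expected_opcs starts a dedicated spdk_tgt with that config on /dev/fd/63. Written out with process substitution instead of the harness's saved descriptor, the launch is roughly as below; the outer "subsystems" wrapper is inferred from the jq path used by check_save_config (traced below), pmd 0 presumably lets the driver pick a PMD (as the later QAT PMD notices suggest), and scripts/rpc.py stands in for the rpc_cmd wrapper seen in the trace:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/bin/spdk_tgt -c <(echo '{"subsystems": [{"subsystem": "accel", "config": [
        {"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}]}]}') &
    spdk_tgt_pid=$!
    waitforlisten "$spdk_tgt_pid"    # helper from test/common/autotest_common.sh
    scripts/rpc.py save_config \
        | jq -r '.subsystems[] | select(.subsystem=="accel").config[]' \
        | grep -q compressdev_scan_accel_module    # the check_save_config step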
00:08:19.600 [2024-07-12 10:34:54.648170] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1992726 ] 00:08:19.600 [2024-07-12 10:34:54.778950] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.859 [2024-07-12 10:34:54.886103] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.797 [2024-07-12 10:34:55.650761] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:20.797 10:34:55 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:20.797 10:34:55 accel -- common/autotest_common.sh@862 -- # return 0 00:08:20.797 10:34:55 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:20.797 10:34:55 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:20.797 10:34:55 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:20.797 10:34:55 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:20.797 10:34:55 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:20.797 10:34:55 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:20.797 10:34:55 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.797 10:34:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:20.797 10:34:55 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:20.797 10:34:55 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:21.057 10:34:55 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.057 "method": "compressdev_scan_accel_module", 00:08:21.057 10:34:55 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:21.057 10:34:55 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:21.057 10:34:55 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:21.057 10:34:55 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:21.057 10:34:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:21.057 10:34:56 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.057 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.057 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.057 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.058 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 
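The accel.sh@70-73 entries around this point build the expected opcode-to-module table: accel_get_opc_assignments is queried over RPC, its JSON reply is flattened by jq into key=value pairs, and each pair becomes an associative-array entry. With compressdev loaded, most opcodes stay on "software" while two map to "dpdk_compressdev". Reconstructed from the trace (scripts/rpc.py standing in for $rpc_py):

    declare -A expected_opcs
    exp_opcs=($(scripts/rpc.py accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))
    for opc_opt in "${exp_opcs[@]}"; do
        IFS="=" read -r opc module <<< "$opc_opt"
        expected_opcs["$opc"]=$module    # e.g. decompress=dpdk_compressdev in this run
    done

A single opcode can be spot-checked the same way, e.g. scripts/rpc.py accel_get_opc_assignments | jq -r .decompress, which should print dpdk_compressdev while this target is up.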
00:08:21.058 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.058 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.058 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.058 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.058 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.058 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.058 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.058 10:34:56 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # IFS== 00:08:21.058 10:34:56 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:21.058 10:34:56 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:21.058 10:34:56 accel -- accel/accel.sh@75 -- # killprocess 1992726 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@948 -- # '[' -z 1992726 ']' 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@952 -- # kill -0 1992726 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@953 -- # uname 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1992726 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1992726' 00:08:21.058 killing process with pid 1992726 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@967 -- # kill 1992726 00:08:21.058 10:34:56 accel -- common/autotest_common.sh@972 -- # wait 1992726 00:08:21.318 10:34:56 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:21.318 10:34:56 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:21.318 10:34:56 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:21.318 10:34:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:21.318 10:34:56 accel -- common/autotest_common.sh@10 -- # set +x 00:08:21.578 ************************************ 00:08:21.578 START TEST accel_cdev_comp 00:08:21.578 ************************************ 00:08:21.578 10:34:56 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:21.578 10:34:56 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:21.578 [2024-07-12 10:34:56.576460] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:21.578 [2024-07-12 10:34:56.576531] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1993026 ] 00:08:21.578 [2024-07-12 10:34:56.708538] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.838 [2024-07-12 10:34:56.814838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.404 [2024-07-12 10:34:57.585758] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:22.404 [2024-07-12 10:34:57.588389] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21cb080 PMD being used: compress_qat 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.404 [2024-07-12 10:34:57.592527] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21cfe60 PMD being used: compress_qat 00:08:22.404 
10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.404 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:22.405 10:34:57 accel.accel_cdev_comp 
-- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:22.405 10:34:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:23.781 10:34:58 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:23.781 00:08:23.781 real 0m2.228s 00:08:23.781 user 0m0.010s 00:08:23.781 sys 0m0.004s 00:08:23.781 10:34:58 accel.accel_cdev_comp 
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:23.781 10:34:58 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:23.781 ************************************ 00:08:23.781 END TEST accel_cdev_comp 00:08:23.781 ************************************ 00:08:23.781 10:34:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:23.781 10:34:58 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:23.781 10:34:58 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:23.781 10:34:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.781 10:34:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:23.781 ************************************ 00:08:23.781 START TEST accel_cdev_decomp 00:08:23.781 ************************************ 00:08:23.781 10:34:58 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:23.781 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:23.781 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:23.781 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:23.781 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:23.781 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:23.782 10:34:58 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:23.782 [2024-07-12 10:34:58.878264] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
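accel_cdev_decomp, whose START TEST banner and accel.sh@12 command appear just above, runs the same decompress workload as the earlier software tests but forces it through the dpdk_compressdev module via the JSON config handed to accel_perf on /dev/fd/62. A rough standalone equivalent from the workspace checkout, with the config inlined (same caveats about the config wrapper and the pmd value as for the spdk_tgt launch earlier):

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/examples/accel_perf \
        -c <(echo '{"subsystems": [{"subsystem": "accel", "config": [
            {"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}]}]}') \
        -t 1 -w decompress -l test/accel/bib -y

The accel.sh@27 checks after the run then require accel_module to be dpdk_compressdev rather than software, consistent with the "Channel ... PMD being used: compress_qat" notices in the trace.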
00:08:23.782 [2024-07-12 10:34:58.878324] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1993381 ] 00:08:24.051 [2024-07-12 10:34:59.007725] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.051 [2024-07-12 10:34:59.108351] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.988 [2024-07-12 10:34:59.870168] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:24.988 [2024-07-12 10:34:59.872722] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2135080 PMD being used: compress_qat 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.988 [2024-07-12 10:34:59.876880] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2139e60 PMD being used: compress_qat 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:24.988 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
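The xtrace above shows how these compressdev tests are wired: build_accel_config appends {"method": "compressdev_scan_accel_module", "params": {"pmd": 0}} to the accel JSON config and accel_perf is launched with that JSON on /dev/fd/62, after which the NOTICE lines report the compressdev channels. Below is a minimal by-hand sketch of the same run; the method object, binary path and flags are copied from the trace, while the "subsystems"/"accel"/"config" wrapper layout and the /tmp/accel_compressdev.json path are assumptions made for this example (pmd 0 appears to leave PMD selection to the module, and the trace shows compress_qat being chosen on this node).
    # Sketch only: rerun the traced accel_perf decompress job with an explicit
    # config file instead of the /dev/fd/62 pipe used by accel.sh.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    echo '{"subsystems":[{"subsystem":"accel","config":[{"method":"compressdev_scan_accel_module","params":{"pmd":0}}]}]}' \
        > /tmp/accel_compressdev.json
    "$SPDK/build/examples/accel_perf" -c /tmp/accel_compressdev.json \
        -t 1 -w decompress -l "$SPDK/test/accel/bib" -y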
00:08:24.989 10:34:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.977 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:25.978 00:08:25.978 real 0m2.196s 00:08:25.978 user 0m0.010s 00:08:25.978 sys 0m0.003s 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:25.978 10:35:01 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:25.978 ************************************ 00:08:25.978 END TEST accel_cdev_decomp 00:08:25.978 ************************************ 00:08:25.978 10:35:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:25.978 10:35:01 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:25.978 10:35:01 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:25.978 10:35:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:25.978 10:35:01 accel -- common/autotest_common.sh@10 -- # set +x 00:08:25.978 ************************************ 00:08:25.978 START TEST accel_cdev_decomp_full 00:08:25.978 ************************************ 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:25.978 10:35:01 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:25.978 [2024-07-12 10:35:01.139509] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:08:25.978 [2024-07-12 10:35:01.139568] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1993603 ] 00:08:26.237 [2024-07-12 10:35:01.267651] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.237 [2024-07-12 10:35:01.368345] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.174 [2024-07-12 10:35:02.138825] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:27.175 [2024-07-12 10:35:02.141498] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2473080 PMD being used: compress_qat 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 [2024-07-12 10:35:02.144840] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2472ce0 PMD being used: compress_qat 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 
-- # val='111250 bytes' 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
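Relative to accel_cdev_decomp, the _full variant traced here adds only -o 0 to the same command line; correspondingly the value dump above records '111250 bytes' instead of '4096 bytes', which suggests the bib test vector is submitted as full-size operations rather than in 4 KiB chunks. The two command lines as recorded in the trace (paths shortened to $SPDK for readability; -c /dev/fd/62 stands for the JSON generated by build_accel_config):
    # accel_cdev_decomp: 4096-byte operations
    "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress -l "$SPDK/test/accel/bib" -y
    # accel_cdev_decomp_full: -o 0, full-size (111250-byte) operations
    "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0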
00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.175 10:35:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.552 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:28.553 00:08:28.553 real 0m2.206s 00:08:28.553 user 0m0.011s 00:08:28.553 sys 0m0.002s 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.553 10:35:03 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:28.553 ************************************ 00:08:28.553 END TEST accel_cdev_decomp_full 00:08:28.553 ************************************ 00:08:28.553 10:35:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:28.553 10:35:03 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.553 10:35:03 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:28.553 10:35:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:28.553 10:35:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:28.553 ************************************ 00:08:28.553 START TEST accel_cdev_decomp_mcore 00:08:28.553 ************************************ 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:28.553 10:35:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:28.553 [2024-07-12 10:35:03.416370] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:08:28.553 [2024-07-12 10:35:03.416429] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1993964 ] 00:08:28.553 [2024-07-12 10:35:03.546100] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:28.553 [2024-07-12 10:35:03.648262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:28.553 [2024-07-12 10:35:03.648348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:28.553 [2024-07-12 10:35:03.648424] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:28.553 [2024-07-12 10:35:03.648428] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.490 [2024-07-12 10:35:04.401594] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:29.490 [2024-07-12 10:35:04.404198] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xad9720 PMD being used: compress_qat 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:29.490 [2024-07-12 10:35:04.409828] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f641419b8b0 PMD being used: compress_qat 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 [2024-07-12 10:35:04.410594] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f640c19b8b0 PMD being used: compress_qat 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 [2024-07-12 10:35:04.411683] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xade9f0 PMD being used: compress_qat 00:08:29.490 [2024-07-12 10:35:04.411894] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f640419b8b0 PMD being used: compress_qat 00:08:29.490 10:35:04 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.490 10:35:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.425 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.426 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:30.426 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:30.426 10:35:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:30.426 00:08:30.426 real 0m2.216s 00:08:30.426 user 0m7.203s 00:08:30.426 sys 0m0.574s 00:08:30.426 10:35:05 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:30.426 10:35:05 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:30.426 ************************************ 00:08:30.426 END TEST accel_cdev_decomp_mcore 00:08:30.426 ************************************ 00:08:30.685 10:35:05 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:30.685 10:35:05 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.685 10:35:05 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:30.685 10:35:05 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.685 10:35:05 accel -- common/autotest_common.sh@10 -- # set +x 00:08:30.685 ************************************ 00:08:30.685 START TEST accel_cdev_decomp_full_mcore 00:08:30.685 ************************************ 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:30.685 
10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:30.685 10:35:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:30.685 [2024-07-12 10:35:05.711790] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:30.685 [2024-07-12 10:35:05.711848] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1994313 ] 00:08:30.685 [2024-07-12 10:35:05.840581] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:30.943 [2024-07-12 10:35:05.946236] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:30.943 [2024-07-12 10:35:05.946321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:30.943 [2024-07-12 10:35:05.946403] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:30.943 [2024-07-12 10:35:05.946406] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.510 [2024-07-12 10:35:06.700010] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:31.510 [2024-07-12 10:35:06.702611] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xea9720 PMD being used: compress_qat 00:08:31.510 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.510 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.510 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.510 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 [2024-07-12 10:35:06.707325] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f3f4419b8b0 PMD being used: compress_qat 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 [2024-07-12 10:35:06.708074] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f3f3c19b8b0 PMD being used: compress_qat 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 [2024-07-12 10:35:06.709204] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xeaca30 PMD being used: compress_qat 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.769 [2024-07-12 10:35:06.709383] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f3f3419b8b0 PMD being used: compress_qat 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.769 10:35:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
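The *_mcore variants pass -m 0xf straight through to accel_perf, which is visible above as -c 0xf on the DPDK EAL command line, 'Total cores available: 4', one reactor started per core 0-3, and a burst of 'PMD being used: compress_qat' NOTICEs as each reactor thread sets up its own compressdev channel. A sketch of the full-buffer multicore invocation with the flags copied from the trace ($SPDK again shortens the workspace path, and -c /dev/fd/62 stands for the generated accel JSON):
    # Multicore, full-buffer decompress run; -m is the accel_perf core mask and
    # shows up in the EAL parameters above as -c 0xf (cores 0-3).
    "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress \
        -l "$SPDK/test/accel/bib" -y -o 0 -m 0xf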
00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:32.713 00:08:32.713 real 0m2.215s 00:08:32.713 user 0m7.158s 00:08:32.713 sys 0m0.590s 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:32.713 10:35:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:32.713 ************************************ 00:08:32.713 END TEST accel_cdev_decomp_full_mcore 00:08:32.713 ************************************ 00:08:32.973 10:35:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:32.973 10:35:07 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.973 10:35:07 accel -- 
common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:32.973 10:35:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:32.973 10:35:07 accel -- common/autotest_common.sh@10 -- # set +x 00:08:32.973 ************************************ 00:08:32.973 START TEST accel_cdev_decomp_mthread 00:08:32.973 ************************************ 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:32.973 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:32.974 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:32.974 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:32.974 10:35:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:32.974 [2024-07-12 10:35:07.994455] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:08:32.974 [2024-07-12 10:35:07.994519] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1994548 ] 00:08:32.974 [2024-07-12 10:35:08.124845] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.233 [2024-07-12 10:35:08.228010] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.801 [2024-07-12 10:35:08.993077] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:33.801 [2024-07-12 10:35:08.995644] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2139080 PMD being used: compress_qat 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.060 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.061 [2024-07-12 10:35:09.000472] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x213e2a0 PMD being used: compress_qat 00:08:34.061 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:34.061 10:35:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 [2024-07-12 10:35:09.002973] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22610f0 PMD being used: compress_qat 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:34.061 10:35:09 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 
10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.061 10:35:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
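The long runs of "IFS=:", "read -r var val" and 'case "$var"' traces above come from accel.sh splitting key:value lines for each test case and latching onto the opcode and module fields (accel_opc=decompress, accel_module=dpdk_compressdev above). A minimal sketch of that shell pattern, with illustrative names rather than the literal accel.sh source:

# Sketch of the key:value parse loop visible in the xtrace above; the
# function name and match patterns are illustrative, not accel.sh itself.
parse_accel_output() {
    local var val accel_opc accel_module
    while IFS=: read -r var val; do
        case "$var" in
            *opcode*) accel_opc=$val ;;
            *module*) accel_module=$val ;;
        esac
    done
    echo "opcode=${accel_opc:-unset} module=${accel_module:-unset}"
}
# e.g. printf 'opcode:decompress\nmodule:dpdk_compressdev\n' | parse_accel_output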
00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:34.998 00:08:34.998 real 0m2.230s 00:08:34.998 user 0m1.646s 00:08:34.998 sys 0m0.584s 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.998 10:35:10 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:34.998 ************************************ 00:08:34.998 END TEST accel_cdev_decomp_mthread 00:08:34.998 ************************************ 00:08:35.255 10:35:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:35.255 10:35:10 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:35.255 10:35:10 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:35.255 10:35:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:35.255 10:35:10 accel -- common/autotest_common.sh@10 -- # set +x 00:08:35.255 ************************************ 00:08:35.255 START TEST accel_cdev_decomp_full_mthread 00:08:35.255 ************************************ 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:35.255 10:35:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:35.255 [2024-07-12 10:35:10.299981] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
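The accel_perf command dumped just above runs one second of decompression of test/accel/bib on two threads (-T 2) through the dpdk_compressdev module, with the accel configuration handed over on /dev/fd/62. A rough standalone equivalent, assuming this workspace layout; the subsystems wrapper around the single compressdev entry built above is an assumption, not copied from the log:

# Hedged sketch of the traced accel_perf invocation.
run_decomp_full_mthread() {
    local SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    local accel_json_cfg=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
    local IFS=,
    "$SPDK/build/examples/accel_perf" \
        -c <(jq -r . <<< "{\"subsystems\":[{\"subsystem\":\"accel\",\"config\":[${accel_json_cfg[*]}]}]}") \
        -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0 -T 2
}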
00:08:35.256 [2024-07-12 10:35:10.300040] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1994911 ] 00:08:35.256 [2024-07-12 10:35:10.425962] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.513 [2024-07-12 10:35:10.525492] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.448 [2024-07-12 10:35:11.296326] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:36.448 [2024-07-12 10:35:11.298936] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c1c080 PMD being used: compress_qat 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.448 [2024-07-12 10:35:11.303111] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c1f3b0 PMD being used: compress_qat 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.448 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.449 [2024-07-12 10:35:11.306005] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1d43cc0 PMD being used: compress_qat 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.449 10:35:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:37.385 00:08:37.385 real 0m2.226s 00:08:37.385 user 0m1.658s 00:08:37.385 sys 0m0.571s 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:37.385 10:35:12 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:37.385 ************************************ 00:08:37.385 END TEST accel_cdev_decomp_full_mthread 00:08:37.385 ************************************ 00:08:37.385 10:35:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:37.385 10:35:12 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:37.385 10:35:12 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:37.385 10:35:12 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:37.385 10:35:12 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:37.385 10:35:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.385 10:35:12 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:37.385 10:35:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:37.385 10:35:12 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:37.385 10:35:12 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:37.385 10:35:12 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:37.385 10:35:12 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:37.385 10:35:12 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:37.385 10:35:12 accel -- accel/accel.sh@41 -- # jq -r . 00:08:37.385 ************************************ 00:08:37.385 START TEST accel_dif_functional_tests 00:08:37.385 ************************************ 00:08:37.385 10:35:12 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:37.644 [2024-07-12 10:35:12.630843] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
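Every test above and below is launched through the same run_test wrapper, which prints the starred START banner, times the command (the "real/user/sys" lines) and prints the matching END banner. A rough reconstruction of that shape (illustrative; not the literal autotest_common.sh code):

# Illustrative reconstruction of the banner/timing wrapper seen around each
# test in this log; argument handling and xtrace toggling are simplified away.
run_test() {
    local test_name=$1
    shift
    echo "************************************"
    echo "START TEST $test_name"
    echo "************************************"
    time "$@"    # emits the real/user/sys lines recorded in this log
    echo "************************************"
    echo "END TEST $test_name"
    echo "************************************"
}
# e.g. run_test accel_dif_functional_tests \
#      /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62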
00:08:37.644 [2024-07-12 10:35:12.630902] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1995269 ] 00:08:37.644 [2024-07-12 10:35:12.758972] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:37.902 [2024-07-12 10:35:12.863368] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:37.902 [2024-07-12 10:35:12.863456] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:37.902 [2024-07-12 10:35:12.863460] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.902 00:08:37.902 00:08:37.902 CUnit - A unit testing framework for C - Version 2.1-3 00:08:37.902 http://cunit.sourceforge.net/ 00:08:37.902 00:08:37.902 00:08:37.902 Suite: accel_dif 00:08:37.902 Test: verify: DIF generated, GUARD check ...passed 00:08:37.902 Test: verify: DIF generated, APPTAG check ...passed 00:08:37.902 Test: verify: DIF generated, REFTAG check ...passed 00:08:37.902 Test: verify: DIF not generated, GUARD check ...[2024-07-12 10:35:12.950703] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:37.902 passed 00:08:37.902 Test: verify: DIF not generated, APPTAG check ...[2024-07-12 10:35:12.950773] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:37.902 passed 00:08:37.902 Test: verify: DIF not generated, REFTAG check ...[2024-07-12 10:35:12.950806] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:37.902 passed 00:08:37.902 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:37.902 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-12 10:35:12.950872] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:37.902 passed 00:08:37.902 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:37.902 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:37.902 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:37.902 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-12 10:35:12.951019] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:37.902 passed 00:08:37.902 Test: verify copy: DIF generated, GUARD check ...passed 00:08:37.902 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:37.902 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:37.902 Test: verify copy: DIF not generated, GUARD check ...[2024-07-12 10:35:12.951181] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:37.902 passed 00:08:37.902 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-12 10:35:12.951214] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:37.902 passed 00:08:37.902 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-12 10:35:12.951245] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:37.902 passed 00:08:37.902 Test: generate copy: DIF generated, GUARD check ...passed 00:08:37.902 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:37.902 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:37.902 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:08:37.902 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:37.902 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:37.902 Test: generate copy: iovecs-len validate ...[2024-07-12 10:35:12.951501] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:08:37.902 passed 00:08:37.902 Test: generate copy: buffer alignment validate ...passed 00:08:37.902 00:08:37.902 Run Summary: Type Total Ran Passed Failed Inactive 00:08:37.902 suites 1 1 n/a 0 0 00:08:37.902 tests 26 26 26 0 0 00:08:37.902 asserts 115 115 115 0 n/a 00:08:37.902 00:08:37.902 Elapsed time = 0.003 seconds 00:08:38.161 00:08:38.161 real 0m0.576s 00:08:38.161 user 0m0.753s 00:08:38.161 sys 0m0.215s 00:08:38.161 10:35:13 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:38.161 10:35:13 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:38.161 ************************************ 00:08:38.161 END TEST accel_dif_functional_tests 00:08:38.161 ************************************ 00:08:38.161 10:35:13 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:38.161 00:08:38.161 real 0m53.157s 00:08:38.161 user 1m1.487s 00:08:38.161 sys 0m11.675s 00:08:38.161 10:35:13 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:38.161 10:35:13 accel -- common/autotest_common.sh@10 -- # set +x 00:08:38.161 ************************************ 00:08:38.161 END TEST accel 00:08:38.161 ************************************ 00:08:38.161 10:35:13 -- common/autotest_common.sh@1142 -- # return 0 00:08:38.161 10:35:13 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:38.161 10:35:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:38.161 10:35:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:38.161 10:35:13 -- common/autotest_common.sh@10 -- # set +x 00:08:38.161 ************************************ 00:08:38.161 START TEST accel_rpc 00:08:38.161 ************************************ 00:08:38.161 10:35:13 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:38.161 * Looking for test storage... 00:08:38.418 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:38.418 10:35:13 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:38.419 10:35:13 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1995347 00:08:38.419 10:35:13 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1995347 00:08:38.419 10:35:13 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:38.419 10:35:13 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 1995347 ']' 00:08:38.419 10:35:13 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:38.419 10:35:13 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:38.419 10:35:13 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:38.419 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
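The accel_rpc suite starting here checks opcode-to-module assignment before framework init: spdk_tgt is brought up with --wait-for-rpc, the copy opcode is assigned first to a non-existent module and then to software, framework_start_init is issued, and the resulting assignment is read back. Condensed into direct rpc.py calls (the same script used elsewhere in this log; spdk_tgt startup and teardown omitted):

# Assumes a spdk_tgt already running with --wait-for-rpc on the default socket.
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$RPC accel_assign_opc -o copy -m incorrect   # accepted pre-init: "Operation copy will be assigned to module incorrect"
$RPC accel_assign_opc -o copy -m software    # overrides the bogus assignment
$RPC framework_start_init
$RPC accel_get_opc_assignments | jq -r .copy | grep software   # expect: software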
00:08:38.419 10:35:13 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:38.419 10:35:13 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:38.419 [2024-07-12 10:35:13.436006] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:38.419 [2024-07-12 10:35:13.436079] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1995347 ] 00:08:38.419 [2024-07-12 10:35:13.565892] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.675 [2024-07-12 10:35:13.663899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.241 10:35:14 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:39.241 10:35:14 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:39.241 10:35:14 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:39.241 10:35:14 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:39.241 10:35:14 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:39.241 10:35:14 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:39.241 10:35:14 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:39.241 10:35:14 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:39.241 10:35:14 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.241 10:35:14 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:39.241 ************************************ 00:08:39.241 START TEST accel_assign_opcode 00:08:39.241 ************************************ 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:39.241 [2024-07-12 10:35:14.386185] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:39.241 [2024-07-12 10:35:14.398213] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.241 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:39.500 software 00:08:39.500 00:08:39.500 real 0m0.296s 00:08:39.500 user 0m0.052s 00:08:39.500 sys 0m0.012s 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.500 10:35:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:39.500 ************************************ 00:08:39.500 END TEST accel_assign_opcode 00:08:39.500 ************************************ 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:39.758 10:35:14 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1995347 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 1995347 ']' 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 1995347 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1995347 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1995347' 00:08:39.758 killing process with pid 1995347 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@967 -- # kill 1995347 00:08:39.758 10:35:14 accel_rpc -- common/autotest_common.sh@972 -- # wait 1995347 00:08:40.017 00:08:40.017 real 0m1.899s 00:08:40.017 user 0m1.981s 00:08:40.017 sys 0m0.568s 00:08:40.017 10:35:15 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:40.017 10:35:15 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:40.017 ************************************ 00:08:40.017 END TEST accel_rpc 00:08:40.017 ************************************ 00:08:40.017 10:35:15 -- common/autotest_common.sh@1142 -- # return 0 00:08:40.017 10:35:15 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:40.017 10:35:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:40.017 10:35:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:40.017 10:35:15 -- common/autotest_common.sh@10 -- # set +x 00:08:40.276 ************************************ 00:08:40.276 START TEST app_cmdline 00:08:40.276 ************************************ 00:08:40.276 10:35:15 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:40.276 * Looking for test storage... 
00:08:40.276 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:40.276 10:35:15 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:40.276 10:35:15 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1995760 00:08:40.276 10:35:15 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1995760 00:08:40.276 10:35:15 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:40.276 10:35:15 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 1995760 ']' 00:08:40.276 10:35:15 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:40.276 10:35:15 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:40.276 10:35:15 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:40.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:40.276 10:35:15 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:40.276 10:35:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:40.276 [2024-07-12 10:35:15.420886] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:40.276 [2024-07-12 10:35:15.420961] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1995760 ] 00:08:40.535 [2024-07-12 10:35:15.552034] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.535 [2024-07-12 10:35:15.657302] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:41.500 { 00:08:41.500 "version": "SPDK v24.09-pre git sha1 b3936a144", 00:08:41.500 "fields": { 00:08:41.500 "major": 24, 00:08:41.500 "minor": 9, 00:08:41.500 "patch": 0, 00:08:41.500 "suffix": "-pre", 00:08:41.500 "commit": "b3936a144" 00:08:41.500 } 00:08:41.500 } 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:41.500 10:35:16 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:41.500 10:35:16 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:41.758 request: 00:08:41.758 { 00:08:41.758 "method": "env_dpdk_get_mem_stats", 00:08:41.758 "req_id": 1 00:08:41.758 } 00:08:41.758 Got JSON-RPC error response 00:08:41.758 response: 00:08:41.758 { 00:08:41.758 "code": -32601, 00:08:41.758 "message": "Method not found" 00:08:41.758 } 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:41.758 10:35:16 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1995760 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 1995760 ']' 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 1995760 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1995760 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1995760' 00:08:41.758 killing process with pid 1995760 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@967 -- # kill 1995760 00:08:41.758 10:35:16 app_cmdline -- common/autotest_common.sh@972 -- # wait 1995760 00:08:42.017 00:08:42.017 real 0m1.928s 00:08:42.017 user 0m2.308s 00:08:42.017 sys 0m0.555s 00:08:42.017 10:35:17 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:42.017 10:35:17 app_cmdline -- common/autotest_common.sh@10 -- # set +x 
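The cmdline run above exercises the --rpcs-allowed allow-list: spdk_get_version and rpc_get_methods answer normally, while any other method (env_dpdk_get_mem_stats here) is rejected with JSON-RPC error -32601, "Method not found". A minimal way to reproduce that behaviour against such a target, reusing the rpc.py path from this workspace:

# Against a target started as: spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$RPC spdk_get_version | jq -r .version    # allowed: "SPDK v24.09-pre git sha1 b3936a144"
$RPC rpc_get_methods | jq -r '.[]' | sort # allowed: only the two whitelisted methods
$RPC env_dpdk_get_mem_stats               # rejected: error -32601, "Method not found"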
00:08:42.017 ************************************ 00:08:42.017 END TEST app_cmdline 00:08:42.017 ************************************ 00:08:42.277 10:35:17 -- common/autotest_common.sh@1142 -- # return 0 00:08:42.277 10:35:17 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:42.277 10:35:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:42.277 10:35:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:42.277 10:35:17 -- common/autotest_common.sh@10 -- # set +x 00:08:42.277 ************************************ 00:08:42.277 START TEST version 00:08:42.277 ************************************ 00:08:42.277 10:35:17 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:42.277 * Looking for test storage... 00:08:42.277 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:42.277 10:35:17 version -- app/version.sh@17 -- # get_header_version major 00:08:42.277 10:35:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # cut -f2 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # tr -d '"' 00:08:42.277 10:35:17 version -- app/version.sh@17 -- # major=24 00:08:42.277 10:35:17 version -- app/version.sh@18 -- # get_header_version minor 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # cut -f2 00:08:42.277 10:35:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # tr -d '"' 00:08:42.277 10:35:17 version -- app/version.sh@18 -- # minor=9 00:08:42.277 10:35:17 version -- app/version.sh@19 -- # get_header_version patch 00:08:42.277 10:35:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # cut -f2 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # tr -d '"' 00:08:42.277 10:35:17 version -- app/version.sh@19 -- # patch=0 00:08:42.277 10:35:17 version -- app/version.sh@20 -- # get_header_version suffix 00:08:42.277 10:35:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # cut -f2 00:08:42.277 10:35:17 version -- app/version.sh@14 -- # tr -d '"' 00:08:42.277 10:35:17 version -- app/version.sh@20 -- # suffix=-pre 00:08:42.277 10:35:17 version -- app/version.sh@22 -- # version=24.9 00:08:42.277 10:35:17 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:42.277 10:35:17 version -- app/version.sh@28 -- # version=24.9rc0 00:08:42.277 10:35:17 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:42.277 10:35:17 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:42.277 10:35:17 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:42.277 
10:35:17 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:42.277 00:08:42.277 real 0m0.185s 00:08:42.277 user 0m0.097s 00:08:42.277 sys 0m0.135s 00:08:42.277 10:35:17 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:42.277 10:35:17 version -- common/autotest_common.sh@10 -- # set +x 00:08:42.277 ************************************ 00:08:42.277 END TEST version 00:08:42.277 ************************************ 00:08:42.536 10:35:17 -- common/autotest_common.sh@1142 -- # return 0 00:08:42.536 10:35:17 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:42.536 10:35:17 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:42.536 10:35:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:42.536 10:35:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:42.536 10:35:17 -- common/autotest_common.sh@10 -- # set +x 00:08:42.536 ************************************ 00:08:42.536 START TEST blockdev_general 00:08:42.536 ************************************ 00:08:42.536 10:35:17 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:42.536 * Looking for test storage... 00:08:42.536 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:42.536 10:35:17 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1996072 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:42.536 10:35:17 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1996072 00:08:42.536 10:35:17 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 1996072 ']' 00:08:42.536 10:35:17 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:42.536 10:35:17 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:42.536 10:35:17 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:42.536 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:42.536 10:35:17 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:42.536 10:35:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:42.536 [2024-07-12 10:35:17.695840] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:08:42.536 [2024-07-12 10:35:17.695917] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1996072 ] 00:08:42.794 [2024-07-12 10:35:17.824872] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.794 [2024-07-12 10:35:17.922719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.359 10:35:18 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:43.359 10:35:18 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:43.359 10:35:18 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:43.360 10:35:18 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:43.360 10:35:18 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:43.360 10:35:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.360 10:35:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:43.618 [2024-07-12 10:35:18.786797] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:43.618 [2024-07-12 10:35:18.786850] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:43.618 00:08:43.618 [2024-07-12 10:35:18.794775] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:43.618 [2024-07-12 10:35:18.794800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:43.618 00:08:43.876 Malloc0 00:08:43.876 Malloc1 00:08:43.876 Malloc2 00:08:43.876 Malloc3 00:08:43.876 Malloc4 00:08:43.876 Malloc5 00:08:43.876 Malloc6 00:08:43.876 Malloc7 00:08:43.876 Malloc8 00:08:43.876 Malloc9 00:08:43.876 [2024-07-12 10:35:18.943523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:43.876 [2024-07-12 10:35:18.943572] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:43.876 [2024-07-12 
10:35:18.943593] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x108f350 00:08:43.876 [2024-07-12 10:35:18.943606] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:43.876 [2024-07-12 10:35:18.944952] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:43.876 [2024-07-12 10:35:18.944980] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:43.876 TestPT 00:08:43.876 10:35:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.876 10:35:18 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:43.876 5000+0 records in 00:08:43.876 5000+0 records out 00:08:43.876 10240000 bytes (10 MB, 9.8 MiB) copied, 0.017174 s, 596 MB/s 00:08:43.876 10:35:19 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:43.876 AIO0 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.876 10:35:19 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.876 10:35:19 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:43.876 10:35:19 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.876 10:35:19 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.876 10:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:44.134 10:35:19 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:44.134 10:35:19 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:44.134 10:35:19 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:44.134 10:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:44.134 10:35:19 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:44.134 10:35:19 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:44.134 10:35:19 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:44.134 10:35:19 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:44.134 10:35:19 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:44.134 10:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:44.394 10:35:19 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:44.394 10:35:19 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:44.395 10:35:19 
blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "15b59281-4c35-4933-91ef-16eafc81eef6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "15b59281-4c35-4933-91ef-16eafc81eef6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ceb39c66-5201-5b78-bb6c-c69e2ce302b7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ceb39c66-5201-5b78-bb6c-c69e2ce302b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "ac07412a-683c-54a4-98ad-26223bdf550d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ac07412a-683c-54a4-98ad-26223bdf550d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f98efac8-633a-5712-a160-080060a8ce8e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f98efac8-633a-5712-a160-080060a8ce8e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "32c99dfb-c543-50f1-a1d9-7aa25dfbbe99"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "32c99dfb-c543-50f1-a1d9-7aa25dfbbe99",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "74ae2a83-88e4-57f9-8ecf-1132ad4a3954"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "74ae2a83-88e4-57f9-8ecf-1132ad4a3954",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "004a9bce-3f9c-5a71-89e7-5f4eb7fd93ce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "004a9bce-3f9c-5a71-89e7-5f4eb7fd93ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "085fe99a-e648-54f6-9146-c2b09eede404"' ' ],' ' "product_name": "Split Disk",' ' 
"block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "085fe99a-e648-54f6-9146-c2b09eede404",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "3fc270a7-627c-5e5d-85cd-efbe5d15dbf9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3fc270a7-627c-5e5d-85cd-efbe5d15dbf9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0b8ba2b9-3a31-5231-962a-c22cb63135ea"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0b8ba2b9-3a31-5231-962a-c22cb63135ea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "159ca879-9e95-5af0-9b32-e1e1758788d9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "159ca879-9e95-5af0-9b32-e1e1758788d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "925ef954-63bb-58cb-9b3c-34f28223120a"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "925ef954-63bb-58cb-9b3c-34f28223120a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "7fdf3dc8-9001-409f-9dae-db12d81478ba"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "7fdf3dc8-9001-409f-9dae-db12d81478ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "7fdf3dc8-9001-409f-9dae-db12d81478ba",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "2c4bc5a6-be06-485f-a9bd-38830e948838",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "03f4126c-51a7-47d0-a8f8-a911331bffc2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "67398443-3c22-4366-9d66-e2669cdd8da8"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "67398443-3c22-4366-9d66-e2669cdd8da8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' 
"claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "67398443-3c22-4366-9d66-e2669cdd8da8",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "ad700643-ed6a-4847-a5bf-6654bd62a8c0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a771bc09-fff9-4de1-b2ba-603656ae7491",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "df8c5d96-84fe-44b0-b173-47a03edfdcdb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "df8c5d96-84fe-44b0-b173-47a03edfdcdb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "df8c5d96-84fe-44b0-b173-47a03edfdcdb",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "69856429-6b73-4cee-802c-51ee6d902da3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "119ff60d-10e8-4666-af59-f9a5bab65d2c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "93ca69a0-37c2-4c21-8dfe-0d47ea6d499d"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "93ca69a0-37c2-4c21-8dfe-0d47ea6d499d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:44.395 10:35:19 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:44.395 10:35:19 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:44.395 10:35:19 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:44.395 10:35:19 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:44.395 10:35:19 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 1996072 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 1996072 ']' 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 1996072 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1996072 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1996072' 00:08:44.395 killing process with pid 1996072 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@967 -- # kill 1996072 00:08:44.395 10:35:19 blockdev_general -- common/autotest_common.sh@972 -- # wait 1996072 00:08:44.960 10:35:19 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:44.961 10:35:19 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:44.961 10:35:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:44.961 10:35:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.961 10:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:44.961 ************************************ 00:08:44.961 START TEST bdev_hello_world 00:08:44.961 ************************************ 00:08:44.961 10:35:19 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:44.961 [2024-07-12 10:35:20.019363] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:08:44.961 [2024-07-12 10:35:20.019404] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1996441 ] 00:08:44.961 [2024-07-12 10:35:20.132962] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.219 [2024-07-12 10:35:20.235779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.219 [2024-07-12 10:35:20.395716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:45.219 [2024-07-12 10:35:20.395780] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:45.219 [2024-07-12 10:35:20.395795] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:45.219 [2024-07-12 10:35:20.403721] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:45.219 [2024-07-12 10:35:20.403747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:45.219 [2024-07-12 10:35:20.411733] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:45.219 [2024-07-12 10:35:20.411756] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:45.476 [2024-07-12 10:35:20.489103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:45.476 [2024-07-12 10:35:20.489152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:45.476 [2024-07-12 10:35:20.489170] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfac3c0 00:08:45.477 [2024-07-12 10:35:20.489183] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:45.477 [2024-07-12 10:35:20.490624] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:45.477 [2024-07-12 10:35:20.490652] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:45.477 [2024-07-12 10:35:20.646920] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:45.477 [2024-07-12 10:35:20.646990] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:45.477 [2024-07-12 10:35:20.647047] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:45.477 [2024-07-12 10:35:20.647123] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:45.477 [2024-07-12 10:35:20.647198] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:45.477 [2024-07-12 10:35:20.647227] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:45.477 [2024-07-12 10:35:20.647291] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
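At this point hello_bdev has completed its round trip against Malloc0: the app opened the bdev, took an I/O channel, wrote the test string, and read it back ("Read string from bdev : Hello World!"). The run is driven entirely by the JSON config recorded earlier; as a sketch, the same step can be repeated by hand with the command the trace itself shows (paths assumed to match this workspace, with the generated bdev.json still present):

    # manual re-run of the bdev_hello_world step traced above
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    $SPDK/build/examples/hello_bdev --json $SPDK/test/bdev/bdev.json -b Malloc0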
00:08:45.477 00:08:45.477 [2024-07-12 10:35:20.647330] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:46.043 00:08:46.043 real 0m0.998s 00:08:46.043 user 0m0.659s 00:08:46.043 sys 0m0.292s 00:08:46.043 10:35:20 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:46.043 10:35:20 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:46.043 ************************************ 00:08:46.043 END TEST bdev_hello_world 00:08:46.043 ************************************ 00:08:46.043 10:35:21 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:46.043 10:35:21 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:46.043 10:35:21 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:46.043 10:35:21 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:46.043 10:35:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:46.043 ************************************ 00:08:46.043 START TEST bdev_bounds 00:08:46.043 ************************************ 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1996634 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1996634' 00:08:46.043 Process bdevio pid: 1996634 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1996634 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1996634 ']' 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:46.043 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:46.043 10:35:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:46.043 [2024-07-12 10:35:21.112924] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:08:46.043 [2024-07-12 10:35:21.112985] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1996634 ] 00:08:46.043 [2024-07-12 10:35:21.234306] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:46.301 [2024-07-12 10:35:21.345330] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:46.301 [2024-07-12 10:35:21.345415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:46.301 [2024-07-12 10:35:21.345419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.559 [2024-07-12 10:35:21.502158] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:46.560 [2024-07-12 10:35:21.502205] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:46.560 [2024-07-12 10:35:21.502220] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:46.560 [2024-07-12 10:35:21.510167] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:46.560 [2024-07-12 10:35:21.510193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:46.560 [2024-07-12 10:35:21.518183] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:46.560 [2024-07-12 10:35:21.518207] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:46.560 [2024-07-12 10:35:21.595336] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:46.560 [2024-07-12 10:35:21.595384] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:46.560 [2024-07-12 10:35:21.595403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23dd0c0 00:08:46.560 [2024-07-12 10:35:21.595415] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:46.560 [2024-07-12 10:35:21.596919] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:46.560 [2024-07-12 10:35:21.596948] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:47.127 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:47.127 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:47.127 10:35:22 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:47.127 I/O targets: 00:08:47.127 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:47.127 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:47.127 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:47.127 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:47.127 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:47.127 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:47.127 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:08:47.127 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:47.127 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:47.127 00:08:47.127 00:08:47.127 CUnit - A unit testing framework for C - Version 2.1-3 00:08:47.127 http://cunit.sourceforge.net/ 00:08:47.127 00:08:47.127 00:08:47.127 Suite: bdevio tests on: AIO0 00:08:47.127 Test: blockdev write read block ...passed 00:08:47.127 Test: blockdev write zeroes read block ...passed 00:08:47.127 Test: blockdev write zeroes read no split ...passed 00:08:47.128 Test: blockdev write zeroes read split ...passed 00:08:47.128 Test: blockdev write zeroes read split partial ...passed 00:08:47.128 Test: blockdev reset ...passed 00:08:47.128 Test: blockdev write read 8 blocks ...passed 00:08:47.128 Test: blockdev write read size > 128k ...passed 00:08:47.128 Test: blockdev write read invalid size ...passed 00:08:47.128 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.128 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.128 Test: blockdev write read max offset ...passed 00:08:47.128 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.128 Test: blockdev writev readv 8 blocks ...passed 00:08:47.128 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.128 Test: blockdev writev readv block ...passed 00:08:47.128 Test: blockdev writev readv size > 128k ...passed 00:08:47.128 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.128 Test: blockdev comparev and writev ...passed 00:08:47.128 Test: blockdev nvme passthru rw ...passed 00:08:47.128 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.128 Test: blockdev nvme admin passthru ...passed 00:08:47.128 Test: blockdev copy ...passed 00:08:47.128 Suite: bdevio tests on: raid1 00:08:47.128 Test: blockdev write read block ...passed 00:08:47.128 Test: blockdev write zeroes read block ...passed 00:08:47.128 Test: blockdev write zeroes read no split ...passed 00:08:47.128 Test: blockdev write zeroes read split ...passed 00:08:47.128 Test: blockdev write zeroes read split partial ...passed 00:08:47.128 Test: blockdev reset ...passed 00:08:47.128 Test: blockdev write read 8 blocks ...passed 00:08:47.128 Test: blockdev write read size > 128k ...passed 00:08:47.128 Test: blockdev write read invalid size ...passed 00:08:47.128 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.128 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.128 Test: blockdev write read max offset ...passed 00:08:47.128 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.128 Test: blockdev writev readv 8 blocks ...passed 00:08:47.128 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.128 Test: blockdev writev readv block ...passed 00:08:47.128 Test: blockdev writev readv size > 128k ...passed 00:08:47.128 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.128 Test: blockdev comparev and writev ...passed 00:08:47.128 Test: blockdev nvme passthru rw ...passed 00:08:47.128 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.128 Test: blockdev nvme admin passthru ...passed 00:08:47.128 Test: blockdev copy ...passed 00:08:47.128 Suite: bdevio tests on: concat0 00:08:47.128 Test: blockdev write read block ...passed 00:08:47.128 Test: blockdev write zeroes read block ...passed 00:08:47.128 Test: blockdev write zeroes read no split ...passed 00:08:47.128 Test: blockdev write zeroes read split 
...passed 00:08:47.128 Test: blockdev write zeroes read split partial ...passed 00:08:47.128 Test: blockdev reset ...passed 00:08:47.128 Test: blockdev write read 8 blocks ...passed 00:08:47.128 Test: blockdev write read size > 128k ...passed 00:08:47.128 Test: blockdev write read invalid size ...passed 00:08:47.128 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.128 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.128 Test: blockdev write read max offset ...passed 00:08:47.128 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.128 Test: blockdev writev readv 8 blocks ...passed 00:08:47.128 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.128 Test: blockdev writev readv block ...passed 00:08:47.128 Test: blockdev writev readv size > 128k ...passed 00:08:47.128 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.128 Test: blockdev comparev and writev ...passed 00:08:47.128 Test: blockdev nvme passthru rw ...passed 00:08:47.128 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.128 Test: blockdev nvme admin passthru ...passed 00:08:47.128 Test: blockdev copy ...passed 00:08:47.128 Suite: bdevio tests on: raid0 00:08:47.128 Test: blockdev write read block ...passed 00:08:47.128 Test: blockdev write zeroes read block ...passed 00:08:47.128 Test: blockdev write zeroes read no split ...passed 00:08:47.128 Test: blockdev write zeroes read split ...passed 00:08:47.128 Test: blockdev write zeroes read split partial ...passed 00:08:47.128 Test: blockdev reset ...passed 00:08:47.128 Test: blockdev write read 8 blocks ...passed 00:08:47.128 Test: blockdev write read size > 128k ...passed 00:08:47.128 Test: blockdev write read invalid size ...passed 00:08:47.128 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.128 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.128 Test: blockdev write read max offset ...passed 00:08:47.128 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.128 Test: blockdev writev readv 8 blocks ...passed 00:08:47.128 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.128 Test: blockdev writev readv block ...passed 00:08:47.128 Test: blockdev writev readv size > 128k ...passed 00:08:47.128 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.128 Test: blockdev comparev and writev ...passed 00:08:47.128 Test: blockdev nvme passthru rw ...passed 00:08:47.128 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.128 Test: blockdev nvme admin passthru ...passed 00:08:47.128 Test: blockdev copy ...passed 00:08:47.128 Suite: bdevio tests on: TestPT 00:08:47.128 Test: blockdev write read block ...passed 00:08:47.128 Test: blockdev write zeroes read block ...passed 00:08:47.128 Test: blockdev write zeroes read no split ...passed 00:08:47.128 Test: blockdev write zeroes read split ...passed 00:08:47.128 Test: blockdev write zeroes read split partial ...passed 00:08:47.128 Test: blockdev reset ...passed 00:08:47.128 Test: blockdev write read 8 blocks ...passed 00:08:47.128 Test: blockdev write read size > 128k ...passed 00:08:47.128 Test: blockdev write read invalid size ...passed 00:08:47.128 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.128 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.128 Test: blockdev write read max offset ...passed 00:08:47.128 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.128 Test: blockdev writev readv 8 blocks ...passed 00:08:47.128 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.128 Test: blockdev writev readv block ...passed 00:08:47.128 Test: blockdev writev readv size > 128k ...passed 00:08:47.128 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.128 Test: blockdev comparev and writev ...passed 00:08:47.128 Test: blockdev nvme passthru rw ...passed 00:08:47.128 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.128 Test: blockdev nvme admin passthru ...passed 00:08:47.128 Test: blockdev copy ...passed 00:08:47.128 Suite: bdevio tests on: Malloc2p7 00:08:47.128 Test: blockdev write read block ...passed 00:08:47.128 Test: blockdev write zeroes read block ...passed 00:08:47.128 Test: blockdev write zeroes read no split ...passed 00:08:47.128 Test: blockdev write zeroes read split ...passed 00:08:47.128 Test: blockdev write zeroes read split partial ...passed 00:08:47.128 Test: blockdev reset ...passed 00:08:47.128 Test: blockdev write read 8 blocks ...passed 00:08:47.128 Test: blockdev write read size > 128k ...passed 00:08:47.128 Test: blockdev write read invalid size ...passed 00:08:47.128 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.128 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.128 Test: blockdev write read max offset ...passed 00:08:47.128 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.128 Test: blockdev writev readv 8 blocks ...passed 00:08:47.128 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.128 Test: blockdev writev readv block ...passed 00:08:47.128 Test: blockdev writev readv size > 128k ...passed 00:08:47.128 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.128 Test: blockdev comparev and writev ...passed 00:08:47.128 Test: blockdev nvme passthru rw ...passed 00:08:47.128 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.128 Test: blockdev nvme admin passthru ...passed 00:08:47.129 Test: blockdev copy ...passed 00:08:47.129 Suite: bdevio tests on: Malloc2p6 00:08:47.129 Test: blockdev write read block ...passed 00:08:47.129 Test: blockdev write zeroes read block ...passed 00:08:47.129 Test: blockdev write zeroes read no split ...passed 00:08:47.129 Test: blockdev write zeroes read split ...passed 00:08:47.129 Test: blockdev write zeroes read split partial ...passed 00:08:47.129 Test: blockdev reset ...passed 00:08:47.129 Test: blockdev write read 8 blocks ...passed 00:08:47.129 Test: blockdev write read size > 128k ...passed 00:08:47.129 Test: blockdev write read invalid size ...passed 00:08:47.129 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.129 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.129 Test: blockdev write read max offset ...passed 00:08:47.129 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.129 Test: blockdev writev readv 8 blocks ...passed 00:08:47.129 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.129 Test: blockdev writev readv block ...passed 00:08:47.129 Test: blockdev writev readv size > 128k ...passed 00:08:47.129 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.129 Test: blockdev comparev and writev ...passed 00:08:47.129 Test: blockdev nvme passthru rw ...passed 00:08:47.129 Test: blockdev nvme passthru vendor 
specific ...passed 00:08:47.129 Test: blockdev nvme admin passthru ...passed 00:08:47.129 Test: blockdev copy ...passed 00:08:47.129 Suite: bdevio tests on: Malloc2p5 00:08:47.129 Test: blockdev write read block ...passed 00:08:47.129 Test: blockdev write zeroes read block ...passed 00:08:47.129 Test: blockdev write zeroes read no split ...passed 00:08:47.129 Test: blockdev write zeroes read split ...passed 00:08:47.129 Test: blockdev write zeroes read split partial ...passed 00:08:47.129 Test: blockdev reset ...passed 00:08:47.129 Test: blockdev write read 8 blocks ...passed 00:08:47.129 Test: blockdev write read size > 128k ...passed 00:08:47.129 Test: blockdev write read invalid size ...passed 00:08:47.129 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.129 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.129 Test: blockdev write read max offset ...passed 00:08:47.129 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.129 Test: blockdev writev readv 8 blocks ...passed 00:08:47.129 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.129 Test: blockdev writev readv block ...passed 00:08:47.129 Test: blockdev writev readv size > 128k ...passed 00:08:47.129 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.129 Test: blockdev comparev and writev ...passed 00:08:47.129 Test: blockdev nvme passthru rw ...passed 00:08:47.129 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.129 Test: blockdev nvme admin passthru ...passed 00:08:47.129 Test: blockdev copy ...passed 00:08:47.129 Suite: bdevio tests on: Malloc2p4 00:08:47.129 Test: blockdev write read block ...passed 00:08:47.129 Test: blockdev write zeroes read block ...passed 00:08:47.129 Test: blockdev write zeroes read no split ...passed 00:08:47.129 Test: blockdev write zeroes read split ...passed 00:08:47.388 Test: blockdev write zeroes read split partial ...passed 00:08:47.388 Test: blockdev reset ...passed 00:08:47.388 Test: blockdev write read 8 blocks ...passed 00:08:47.388 Test: blockdev write read size > 128k ...passed 00:08:47.388 Test: blockdev write read invalid size ...passed 00:08:47.388 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.388 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.388 Test: blockdev write read max offset ...passed 00:08:47.388 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.388 Test: blockdev writev readv 8 blocks ...passed 00:08:47.388 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 Suite: bdevio tests on: Malloc2p3 00:08:47.389 Test: blockdev write read block ...passed 00:08:47.389 Test: blockdev write zeroes read block ...passed 00:08:47.389 Test: blockdev write zeroes read no split ...passed 00:08:47.389 Test: blockdev write zeroes read split ...passed 00:08:47.389 Test: blockdev write zeroes read split partial ...passed 00:08:47.389 Test: blockdev reset ...passed 00:08:47.389 Test: 
blockdev write read 8 blocks ...passed 00:08:47.389 Test: blockdev write read size > 128k ...passed 00:08:47.389 Test: blockdev write read invalid size ...passed 00:08:47.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.389 Test: blockdev write read max offset ...passed 00:08:47.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.389 Test: blockdev writev readv 8 blocks ...passed 00:08:47.389 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 Suite: bdevio tests on: Malloc2p2 00:08:47.389 Test: blockdev write read block ...passed 00:08:47.389 Test: blockdev write zeroes read block ...passed 00:08:47.389 Test: blockdev write zeroes read no split ...passed 00:08:47.389 Test: blockdev write zeroes read split ...passed 00:08:47.389 Test: blockdev write zeroes read split partial ...passed 00:08:47.389 Test: blockdev reset ...passed 00:08:47.389 Test: blockdev write read 8 blocks ...passed 00:08:47.389 Test: blockdev write read size > 128k ...passed 00:08:47.389 Test: blockdev write read invalid size ...passed 00:08:47.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.389 Test: blockdev write read max offset ...passed 00:08:47.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.389 Test: blockdev writev readv 8 blocks ...passed 00:08:47.389 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 Suite: bdevio tests on: Malloc2p1 00:08:47.389 Test: blockdev write read block ...passed 00:08:47.389 Test: blockdev write zeroes read block ...passed 00:08:47.389 Test: blockdev write zeroes read no split ...passed 00:08:47.389 Test: blockdev write zeroes read split ...passed 00:08:47.389 Test: blockdev write zeroes read split partial ...passed 00:08:47.389 Test: blockdev reset ...passed 00:08:47.389 Test: blockdev write read 8 blocks ...passed 00:08:47.389 Test: blockdev write read size > 128k ...passed 00:08:47.389 Test: blockdev write read invalid size ...passed 00:08:47.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.389 Test: blockdev write read max offset ...passed 00:08:47.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.389 Test: blockdev writev readv 8 blocks ...passed 00:08:47.389 
Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 Suite: bdevio tests on: Malloc2p0 00:08:47.389 Test: blockdev write read block ...passed 00:08:47.389 Test: blockdev write zeroes read block ...passed 00:08:47.389 Test: blockdev write zeroes read no split ...passed 00:08:47.389 Test: blockdev write zeroes read split ...passed 00:08:47.389 Test: blockdev write zeroes read split partial ...passed 00:08:47.389 Test: blockdev reset ...passed 00:08:47.389 Test: blockdev write read 8 blocks ...passed 00:08:47.389 Test: blockdev write read size > 128k ...passed 00:08:47.389 Test: blockdev write read invalid size ...passed 00:08:47.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.389 Test: blockdev write read max offset ...passed 00:08:47.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.389 Test: blockdev writev readv 8 blocks ...passed 00:08:47.389 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 Suite: bdevio tests on: Malloc1p1 00:08:47.389 Test: blockdev write read block ...passed 00:08:47.389 Test: blockdev write zeroes read block ...passed 00:08:47.389 Test: blockdev write zeroes read no split ...passed 00:08:47.389 Test: blockdev write zeroes read split ...passed 00:08:47.389 Test: blockdev write zeroes read split partial ...passed 00:08:47.389 Test: blockdev reset ...passed 00:08:47.389 Test: blockdev write read 8 blocks ...passed 00:08:47.389 Test: blockdev write read size > 128k ...passed 00:08:47.389 Test: blockdev write read invalid size ...passed 00:08:47.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.389 Test: blockdev write read max offset ...passed 00:08:47.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.389 Test: blockdev writev readv 8 blocks ...passed 00:08:47.389 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 Suite: 
bdevio tests on: Malloc1p0 00:08:47.389 Test: blockdev write read block ...passed 00:08:47.389 Test: blockdev write zeroes read block ...passed 00:08:47.389 Test: blockdev write zeroes read no split ...passed 00:08:47.389 Test: blockdev write zeroes read split ...passed 00:08:47.389 Test: blockdev write zeroes read split partial ...passed 00:08:47.389 Test: blockdev reset ...passed 00:08:47.389 Test: blockdev write read 8 blocks ...passed 00:08:47.389 Test: blockdev write read size > 128k ...passed 00:08:47.389 Test: blockdev write read invalid size ...passed 00:08:47.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.389 Test: blockdev write read max offset ...passed 00:08:47.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.389 Test: blockdev writev readv 8 blocks ...passed 00:08:47.389 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 Suite: bdevio tests on: Malloc0 00:08:47.389 Test: blockdev write read block ...passed 00:08:47.389 Test: blockdev write zeroes read block ...passed 00:08:47.389 Test: blockdev write zeroes read no split ...passed 00:08:47.389 Test: blockdev write zeroes read split ...passed 00:08:47.389 Test: blockdev write zeroes read split partial ...passed 00:08:47.389 Test: blockdev reset ...passed 00:08:47.389 Test: blockdev write read 8 blocks ...passed 00:08:47.389 Test: blockdev write read size > 128k ...passed 00:08:47.389 Test: blockdev write read invalid size ...passed 00:08:47.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:47.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:47.389 Test: blockdev write read max offset ...passed 00:08:47.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:47.389 Test: blockdev writev readv 8 blocks ...passed 00:08:47.389 Test: blockdev writev readv 30 x 1block ...passed 00:08:47.389 Test: blockdev writev readv block ...passed 00:08:47.389 Test: blockdev writev readv size > 128k ...passed 00:08:47.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:47.389 Test: blockdev comparev and writev ...passed 00:08:47.389 Test: blockdev nvme passthru rw ...passed 00:08:47.389 Test: blockdev nvme passthru vendor specific ...passed 00:08:47.389 Test: blockdev nvme admin passthru ...passed 00:08:47.389 Test: blockdev copy ...passed 00:08:47.389 00:08:47.389 Run Summary: Type Total Ran Passed Failed Inactive 00:08:47.389 suites 16 16 n/a 0 0 00:08:47.389 tests 368 368 368 0 0 00:08:47.389 asserts 2224 2224 2224 0 n/a 00:08:47.389 00:08:47.389 Elapsed time = 0.501 seconds 00:08:47.389 0 00:08:47.389 10:35:22 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1996634 00:08:47.389 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1996634 ']' 00:08:47.389 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1996634 
00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1996634 00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1996634' 00:08:47.390 killing process with pid 1996634 00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1996634 00:08:47.390 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1996634 00:08:47.649 10:35:22 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:47.649 00:08:47.649 real 0m1.716s 00:08:47.649 user 0m4.347s 00:08:47.649 sys 0m0.479s 00:08:47.649 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:47.649 10:35:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:47.649 ************************************ 00:08:47.649 END TEST bdev_bounds 00:08:47.649 ************************************ 00:08:47.649 10:35:22 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:47.649 10:35:22 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:47.649 10:35:22 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:47.649 10:35:22 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.649 10:35:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:47.907 ************************************ 00:08:47.907 START TEST bdev_nbd 00:08:47.907 ************************************ 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1996847 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1996847 /var/tmp/spdk-nbd.sock 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1996847 ']' 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:47.907 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:47.907 10:35:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:47.907 [2024-07-12 10:35:22.940110] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:08:47.907 [2024-07-12 10:35:22.940182] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:47.907 [2024-07-12 10:35:23.072301] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.166 [2024-07-12 10:35:23.175052] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.166 [2024-07-12 10:35:23.342545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:48.166 [2024-07-12 10:35:23.342599] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:48.166 [2024-07-12 10:35:23.342614] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:48.166 [2024-07-12 10:35:23.350548] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:48.166 [2024-07-12 10:35:23.350575] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:48.166 [2024-07-12 10:35:23.358558] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:48.166 [2024-07-12 10:35:23.358581] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:48.424 [2024-07-12 10:35:23.432184] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:48.424 [2024-07-12 10:35:23.432236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:48.424 [2024-07-12 10:35:23.432253] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2a60a40 00:08:48.424 [2024-07-12 10:35:23.432266] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:48.424 [2024-07-12 10:35:23.433689] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:48.424 [2024-07-12 10:35:23.433719] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:48.683 10:35:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:48.941 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:49.200 1+0 records in 00:08:49.200 1+0 records out 00:08:49.200 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231928 s, 17.7 MB/s 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.200 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:49.458 10:35:24 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:49.458 1+0 records in 00:08:49.458 1+0 records out 00:08:49.458 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272861 s, 15.0 MB/s 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.458 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:49.716 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:49.716 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:49.716 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:49.716 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:49.716 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:49.716 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.716 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:49.717 1+0 records in 00:08:49.717 1+0 records out 00:08:49.717 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319576 s, 12.8 MB/s 00:08:49.717 
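Each nbd_start_disk above is followed by the same readiness check: wait for the nbdX entry to show up in /proc/partitions, then read a single 4 KiB block with O_DIRECT and confirm the copied file is non-empty ('[' 4096 '!=' 0 ']'). A simplified reconstruction of that check, assuming a scratch file under /tmp is acceptable (the harness writes to test/bdev/nbdtest and wraps the dd in a second retry loop):

    # Sketch of the readiness check traced above, not the harness's own helper.
    waitfornbd_sketch() {
        local nbd_name=$1 out=/tmp/nbdtest i
        # Give the kernel up to ~2 s to publish the new device.
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1
        done
        grep -q -w "$nbd_name" /proc/partitions || return 1
        # One O_DIRECT read forces the I/O through the NBD device into the
        # SPDK bdev instead of letting the page cache answer it.
        dd if="/dev/$nbd_name" of="$out" bs=4096 count=1 iflag=direct 2>/dev/null || return 1
        [ "$(stat -c %s "$out")" != 0 ] || return 1   # same non-zero-size test as the trace
        rm -f "$out"
    }
    # e.g. waitfornbd_sketch nbd2

The iflag=direct read is presumably what makes this a real end-to-end check: the 4 KiB has to travel through the kernel NBD client into the SPDK target rather than being satisfied from cached data.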
10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.717 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.975 10:35:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:49.975 1+0 records in 00:08:49.975 1+0 records out 00:08:49.975 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000351248 s, 11.7 MB/s 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:49.975 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.234 1+0 records in 00:08:50.234 1+0 records out 00:08:50.234 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000395438 s, 10.4 MB/s 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:50.234 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:50.492 1+0 records in 00:08:50.492 1+0 records out 00:08:50.492 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000387387 s, 10.6 MB/s 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:50.492 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.750 1+0 records in 00:08:50.750 1+0 records out 00:08:50.750 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000455233 s, 9.0 MB/s 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:50.750 10:35:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
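The same attach-and-verify cycle repeats for all 16 bdevs (Malloc0 through AIO0); further down the trace, nbd_get_disks reports the resulting device-to-bdev mapping as JSON and nbd_stop_disk detaches each device again. A condensed sketch of that whole cycle driven directly by rpc.py, with the socket path and bdev names taken from the trace (the loop is a simplification of nbd_common.sh, not a copy of it):

    # Sketch: attach every bdev to an NBD device, dump the mapping, tear it down.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock "$@"; }

    bdevs=(Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3
           Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0)

    for b in "${bdevs[@]}"; do
        # With no explicit /dev/nbdX argument the RPC picks a free device and
        # prints its path, e.g. /dev/nbd0 for Malloc0 in the trace above.
        dev=$(rpc nbd_start_disk "$b")
        echo "$b -> $dev"
    done

    # Current mapping as a JSON array of {nbd_device, bdev_name} objects.
    rpc nbd_get_disks | jq -r '.[] | "\(.nbd_device)\t\(.bdev_name)"'

    # Detach everything again; the harness then waits for each nbdX to drop
    # out of /proc/partitions before moving on.
    for dev in $(rpc nbd_get_disks | jq -r '.[] | .nbd_device'); do
        rpc nbd_stop_disk "$dev"
    done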
00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.009 1+0 records in 00:08:51.009 1+0 records out 00:08:51.009 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00039272 s, 10.4 MB/s 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:51.009 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.267 10:35:26 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.267 1+0 records in 00:08:51.267 1+0 records out 00:08:51.267 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00057973 s, 7.1 MB/s 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:51.267 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:51.525 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:51.525 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.783 1+0 records in 00:08:51.783 1+0 records out 00:08:51.783 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000499665 s, 8.2 MB/s 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:51.783 10:35:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:51.783 10:35:26 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.041 1+0 records in 00:08:52.041 1+0 records out 00:08:52.041 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000567592 s, 7.2 MB/s 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:52.041 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.299 1+0 records in 00:08:52.299 1+0 records out 00:08:52.299 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000508352 s, 8.1 MB/s 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:52.299 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.558 1+0 records in 00:08:52.558 1+0 records out 00:08:52.558 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000630826 s, 6.5 MB/s 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.558 10:35:27 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:52.558 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.816 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.816 1+0 records in 00:08:52.817 1+0 records out 00:08:52.817 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000728025 s, 5.6 MB/s 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:52.817 10:35:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:53.075 1+0 records in 00:08:53.075 1+0 records out 00:08:53.075 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000896357 s, 4.6 MB/s 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:53.075 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:53.334 1+0 records in 00:08:53.334 1+0 records out 00:08:53.334 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00088915 s, 4.6 MB/s 00:08:53.334 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.335 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:53.335 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.335 10:35:28 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:53.335 10:35:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:53.335 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:53.335 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:53.335 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:53.593 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd0", 00:08:53.593 "bdev_name": "Malloc0" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd1", 00:08:53.593 "bdev_name": "Malloc1p0" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd2", 00:08:53.593 "bdev_name": "Malloc1p1" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd3", 00:08:53.593 "bdev_name": "Malloc2p0" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd4", 00:08:53.593 "bdev_name": "Malloc2p1" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd5", 00:08:53.593 "bdev_name": "Malloc2p2" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd6", 00:08:53.593 "bdev_name": "Malloc2p3" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd7", 00:08:53.593 "bdev_name": "Malloc2p4" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd8", 00:08:53.593 "bdev_name": "Malloc2p5" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd9", 00:08:53.593 "bdev_name": "Malloc2p6" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd10", 00:08:53.593 "bdev_name": "Malloc2p7" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd11", 00:08:53.593 "bdev_name": "TestPT" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd12", 00:08:53.593 "bdev_name": "raid0" 00:08:53.593 }, 00:08:53.593 { 00:08:53.593 "nbd_device": "/dev/nbd13", 00:08:53.594 "bdev_name": "concat0" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd14", 00:08:53.594 "bdev_name": "raid1" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd15", 00:08:53.594 "bdev_name": "AIO0" 00:08:53.594 } 00:08:53.594 ]' 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd0", 00:08:53.594 "bdev_name": "Malloc0" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd1", 00:08:53.594 "bdev_name": "Malloc1p0" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd2", 00:08:53.594 "bdev_name": "Malloc1p1" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd3", 00:08:53.594 "bdev_name": "Malloc2p0" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd4", 00:08:53.594 "bdev_name": "Malloc2p1" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd5", 00:08:53.594 "bdev_name": "Malloc2p2" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd6", 00:08:53.594 "bdev_name": "Malloc2p3" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd7", 00:08:53.594 "bdev_name": "Malloc2p4" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd8", 00:08:53.594 "bdev_name": "Malloc2p5" 
00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd9", 00:08:53.594 "bdev_name": "Malloc2p6" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd10", 00:08:53.594 "bdev_name": "Malloc2p7" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd11", 00:08:53.594 "bdev_name": "TestPT" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd12", 00:08:53.594 "bdev_name": "raid0" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd13", 00:08:53.594 "bdev_name": "concat0" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd14", 00:08:53.594 "bdev_name": "raid1" 00:08:53.594 }, 00:08:53.594 { 00:08:53.594 "nbd_device": "/dev/nbd15", 00:08:53.594 "bdev_name": "AIO0" 00:08:53.594 } 00:08:53.594 ]' 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:53.594 10:35:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:53.852 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.420 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.690 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:54.964 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:54.964 10:35:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.964 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:08:55.222 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:55.480 10:35:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.046 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.305 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.564 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.823 10:35:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:57.085 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:57.344 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:57.606 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:57.865 10:35:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:58.124 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:58.382 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:58.641 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:58.902 /dev/nbd0 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:58.902 1+0 records in 00:08:58.902 1+0 records out 00:08:58.902 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257953 s, 15.9 MB/s 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:58.902 10:35:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:59.470 /dev/nbd1 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:59.470 1+0 records in 00:08:59.470 1+0 records out 00:08:59.470 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025199 s, 16.3 MB/s 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:59.470 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:59.729 /dev/nbd10 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:59.729 1+0 records in 00:08:59.729 1+0 records out 00:08:59.729 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276555 s, 14.8 MB/s 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:59.729 10:35:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:59.989 /dev/nbd11 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:59.989 1+0 records in 00:08:59.989 1+0 records out 00:08:59.989 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00036082 s, 11.4 MB/s 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:59.989 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:09:00.248 /dev/nbd12 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:00.248 1+0 records in 00:09:00.248 1+0 records out 00:09:00.248 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000335193 s, 12.2 MB/s 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:00.248 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:00.508 /dev/nbd13 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:00.508 10:35:35 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:00.508 1+0 records in 00:09:00.508 1+0 records out 00:09:00.508 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000375086 s, 10.9 MB/s 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:00.508 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:00.767 /dev/nbd14 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:00.767 1+0 records in 00:09:00.767 1+0 records out 00:09:00.767 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000386206 s, 10.6 MB/s 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:00.767 10:35:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:01.027 /dev/nbd15 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:01.027 1+0 records in 00:09:01.027 1+0 records out 00:09:01.027 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000415396 s, 9.9 MB/s 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:01.027 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:01.286 /dev/nbd2 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:01.286 10:35:36 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:01.286 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:01.545 1+0 records in 00:09:01.545 1+0 records out 00:09:01.545 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000559477 s, 7.3 MB/s 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:01.545 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:01.545 /dev/nbd3 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:01.804 1+0 records in 00:09:01.804 1+0 records out 00:09:01.804 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000461267 s, 8.9 MB/s 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:01.804 10:35:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:02.064 /dev/nbd4 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:02.064 1+0 records in 00:09:02.064 1+0 records out 00:09:02.064 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000574672 s, 7.1 MB/s 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:02.064 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:02.323 /dev/nbd5 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:02.323 10:35:37 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:02.323 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:02.324 1+0 records in 00:09:02.324 1+0 records out 00:09:02.324 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000496906 s, 8.2 MB/s 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:02.324 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:02.583 /dev/nbd6 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:02.583 1+0 records in 00:09:02.583 1+0 records out 00:09:02.583 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000553972 s, 7.4 MB/s 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:02.583 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:02.841 /dev/nbd7 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:02.841 1+0 records in 00:09:02.841 1+0 records out 00:09:02.841 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000871814 s, 4.7 MB/s 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:02.841 10:35:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:03.100 /dev/nbd8 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:03.100 10:35:38 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:03.100 1+0 records in 00:09:03.100 1+0 records out 00:09:03.100 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000543547 s, 7.5 MB/s 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:03.100 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:09:03.358 /dev/nbd9 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:03.358 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:03.358 1+0 records in 00:09:03.358 1+0 records out 00:09:03.358 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0006495 s, 6.3 MB/s 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
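The readiness probe the trace repeats for every /dev/nbdX right after each rpc.py nbd_start_disk call boils down to two checks: wait for the device to appear in /proc/partitions, then read a single 4 KiB block with O_DIRECT and confirm a non-empty copy landed. A minimal sketch of that pattern, reconstructed from the commands visible above; the helper name, temp-file path and the retry sleep are assumptions, not the verbatim autotest_common.sh source:

waitfornbd() {
    # Poll until the kernel exposes the device, mirroring the
    # 'grep -q -w nbdX /proc/partitions' loop in the trace (up to 20 tries).
    local nbd_name=$1 i size tmp=/tmp/nbdtest
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1   # assumed back-off; the trace only shows first-try hits
    done
    # Read one block with O_DIRECT and make sure something was copied,
    # as in the 'dd ... bs=4096 count=1 iflag=direct' / 'stat -c %s' steps above.
    for ((i = 1; i <= 20; i++)); do
        if dd if=/dev/"$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct 2>/dev/null; then
            size=$(stat -c %s "$tmp")
            rm -f "$tmp"
            [ "$size" != 0 ] && return 0
        fi
        sleep 0.1
    done
    return 1
}

In the trace this check runs once per device, e.g. waitfornbd nbd0 immediately after Malloc0 is attached to /dev/nbd0.
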
00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:03.359 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd0", 00:09:03.618 "bdev_name": "Malloc0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd1", 00:09:03.618 "bdev_name": "Malloc1p0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd10", 00:09:03.618 "bdev_name": "Malloc1p1" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd11", 00:09:03.618 "bdev_name": "Malloc2p0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd12", 00:09:03.618 "bdev_name": "Malloc2p1" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd13", 00:09:03.618 "bdev_name": "Malloc2p2" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd14", 00:09:03.618 "bdev_name": "Malloc2p3" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd15", 00:09:03.618 "bdev_name": "Malloc2p4" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd2", 00:09:03.618 "bdev_name": "Malloc2p5" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd3", 00:09:03.618 "bdev_name": "Malloc2p6" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd4", 00:09:03.618 "bdev_name": "Malloc2p7" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd5", 00:09:03.618 "bdev_name": "TestPT" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd6", 00:09:03.618 "bdev_name": "raid0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd7", 00:09:03.618 "bdev_name": "concat0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd8", 00:09:03.618 "bdev_name": "raid1" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd9", 00:09:03.618 "bdev_name": "AIO0" 00:09:03.618 } 00:09:03.618 ]' 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd0", 00:09:03.618 "bdev_name": "Malloc0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd1", 00:09:03.618 "bdev_name": "Malloc1p0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd10", 00:09:03.618 "bdev_name": "Malloc1p1" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd11", 00:09:03.618 "bdev_name": "Malloc2p0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd12", 00:09:03.618 "bdev_name": "Malloc2p1" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd13", 00:09:03.618 "bdev_name": "Malloc2p2" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd14", 00:09:03.618 "bdev_name": "Malloc2p3" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd15", 
00:09:03.618 "bdev_name": "Malloc2p4" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd2", 00:09:03.618 "bdev_name": "Malloc2p5" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd3", 00:09:03.618 "bdev_name": "Malloc2p6" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd4", 00:09:03.618 "bdev_name": "Malloc2p7" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd5", 00:09:03.618 "bdev_name": "TestPT" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd6", 00:09:03.618 "bdev_name": "raid0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd7", 00:09:03.618 "bdev_name": "concat0" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd8", 00:09:03.618 "bdev_name": "raid1" 00:09:03.618 }, 00:09:03.618 { 00:09:03.618 "nbd_device": "/dev/nbd9", 00:09:03.618 "bdev_name": "AIO0" 00:09:03.618 } 00:09:03.618 ]' 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:03.618 /dev/nbd1 00:09:03.618 /dev/nbd10 00:09:03.618 /dev/nbd11 00:09:03.618 /dev/nbd12 00:09:03.618 /dev/nbd13 00:09:03.618 /dev/nbd14 00:09:03.618 /dev/nbd15 00:09:03.618 /dev/nbd2 00:09:03.618 /dev/nbd3 00:09:03.618 /dev/nbd4 00:09:03.618 /dev/nbd5 00:09:03.618 /dev/nbd6 00:09:03.618 /dev/nbd7 00:09:03.618 /dev/nbd8 00:09:03.618 /dev/nbd9' 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:03.618 /dev/nbd1 00:09:03.618 /dev/nbd10 00:09:03.618 /dev/nbd11 00:09:03.618 /dev/nbd12 00:09:03.618 /dev/nbd13 00:09:03.618 /dev/nbd14 00:09:03.618 /dev/nbd15 00:09:03.618 /dev/nbd2 00:09:03.618 /dev/nbd3 00:09:03.618 /dev/nbd4 00:09:03.618 /dev/nbd5 00:09:03.618 /dev/nbd6 00:09:03.618 /dev/nbd7 00:09:03.618 /dev/nbd8 00:09:03.618 /dev/nbd9' 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:03.618 256+0 records in 00:09:03.618 256+0 records out 00:09:03.618 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114499 s, 91.6 MB/s 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:03.618 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:03.876 256+0 records in 00:09:03.876 256+0 records out 00:09:03.876 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182655 s, 5.7 MB/s 00:09:03.876 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:03.876 10:35:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:04.134 256+0 records in 00:09:04.134 256+0 records out 00:09:04.134 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.173903 s, 6.0 MB/s 00:09:04.134 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:04.134 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:04.392 256+0 records in 00:09:04.392 256+0 records out 00:09:04.392 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184385 s, 5.7 MB/s 00:09:04.392 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:04.392 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:04.392 256+0 records in 00:09:04.392 256+0 records out 00:09:04.392 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184498 s, 5.7 MB/s 00:09:04.392 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:04.392 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:04.651 256+0 records in 00:09:04.651 256+0 records out 00:09:04.651 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184506 s, 5.7 MB/s 00:09:04.651 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:04.651 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:04.908 256+0 records in 00:09:04.908 256+0 records out 00:09:04.908 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184498 s, 5.7 MB/s 00:09:04.908 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:04.908 10:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:09:04.908 256+0 records in 00:09:04.908 256+0 records out 00:09:04.908 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.187138 s, 5.6 MB/s 00:09:04.908 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:04.908 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:05.165 256+0 records in 00:09:05.165 256+0 records out 00:09:05.165 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.159607 s, 6.6 MB/s 00:09:05.165 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:05.165 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:05.423 256+0 records in 00:09:05.423 256+0 records out 00:09:05.423 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183906 s, 5.7 MB/s 00:09:05.423 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:05.423 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:05.681 256+0 records in 00:09:05.681 256+0 records out 00:09:05.681 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184248 s, 5.7 MB/s 00:09:05.681 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:05.681 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:05.681 256+0 records in 00:09:05.681 256+0 records out 00:09:05.681 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18429 s, 5.7 MB/s 00:09:05.681 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:05.681 10:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:05.939 256+0 records in 00:09:05.939 256+0 records out 00:09:05.939 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184567 s, 5.7 MB/s 00:09:05.939 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:05.939 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:06.197 256+0 records in 00:09:06.197 256+0 records out 00:09:06.197 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185118 s, 5.7 MB/s 00:09:06.197 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:06.197 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:06.197 256+0 records in 00:09:06.197 256+0 records out 00:09:06.197 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183903 s, 5.7 MB/s 00:09:06.197 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:06.197 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:06.454 256+0 records in 00:09:06.454 256+0 records out 00:09:06.454 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188613 s, 5.6 MB/s 00:09:06.454 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:06.454 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:06.710 
256+0 records in 00:09:06.710 256+0 records out 00:09:06.710 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183108 s, 5.7 MB/s 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 
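The nbd_dd_data_verify steps traced here follow a simple write-then-compare pattern: a 1 MiB file of random data is copied onto every /dev/nbdX with dd oflag=direct, and each device is then compared byte-for-byte against that same file with cmp -b -n 1M before the file is removed. A minimal standalone sketch of that pattern is shown below; the device list, scratch-file path, and the use of /dev/urandom are illustrative assumptions, not values taken from this run.

  #!/usr/bin/env bash
  # Minimal sketch of the dd-write / cmp-verify pattern seen in the nbd_common.sh trace.
  # nbd_list, tmp_file and /dev/urandom are placeholders for illustration only.
  set -euo pipefail

  tmp_file=/tmp/nbdrandtest                 # hypothetical scratch file
  nbd_list=(/dev/nbd0 /dev/nbd1)            # hypothetical device list

  # 1) Generate 1 MiB (256 x 4 KiB blocks) of random data as the reference pattern.
  dd if=/dev/urandom of="$tmp_file" bs=4096 count=256

  # 2) Write the pattern onto every NBD device, bypassing the page cache.
  for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
  done

  # 3) Read each device back and byte-compare its first 1 MiB against the reference.
  #    cmp exits non-zero on the first mismatching byte, which fails the test.
  for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"
  done

  # 4) Clean up the reference file.
  rm -f "$tmp_file"

The oflag=direct flag makes dd issue O_DIRECT writes, so each 4 KiB block has to travel through the NBD device to the SPDK bdev rather than landing in the page cache; that is likely why the per-device copies above report roughly 5-6 MB/s while the initial scratch-file creation reports ~90 MB/s.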
00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.710 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:06.967 10:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:07.224 10:35:42 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:07.224 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:07.481 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:07.481 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:07.481 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:07.481 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:07.481 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:07.481 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:07.481 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:07.482 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:07.482 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:07.482 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:07.739 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.002 10:35:42 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:08.002 10:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:08.284 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:08.551 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:08.809 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:08.809 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:08.809 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:08.810 
10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:08.810 10:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:09.068 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:09.069 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:09.069 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:09.327 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:09.586 
10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:09.586 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:09.845 10:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:10.104 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:10.362 
10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:10.362 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:10.622 10:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:10.880 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@104 -- # count=0 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:11.139 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:11.398 malloc_lvol_verify 00:09:11.398 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:11.656 5a24dca2-bd0f-4014-aa38-2a2b123a4f18 00:09:11.657 10:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:11.915 e1a8341c-3250-419f-bbeb-b4de79d6fbb3 00:09:11.915 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:12.174 /dev/nbd0 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:12.174 mke2fs 1.46.5 (30-Dec-2021) 00:09:12.174 Discarding device blocks: 0/4096 done 00:09:12.174 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:12.174 00:09:12.174 Allocating group tables: 0/1 done 00:09:12.174 Writing inode tables: 0/1 done 00:09:12.174 Creating journal (1024 blocks): done 00:09:12.174 Writing superblocks and filesystem accounting information: 0/1 done 00:09:12.174 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.174 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:12.433 10:35:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1996847 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1996847 ']' 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1996847 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:12.433 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1996847 00:09:12.692 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:12.692 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:12.692 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1996847' 00:09:12.692 killing process with pid 1996847 00:09:12.692 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1996847 00:09:12.692 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1996847 00:09:12.951 10:35:47 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:09:12.951 00:09:12.951 real 0m25.110s 00:09:12.951 user 0m30.959s 00:09:12.951 sys 0m14.327s 00:09:12.951 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:12.951 10:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:12.951 ************************************ 00:09:12.951 END TEST bdev_nbd 00:09:12.951 ************************************ 00:09:12.951 10:35:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:12.951 10:35:48 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:09:12.951 10:35:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:09:12.951 10:35:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:09:12.951 10:35:48 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:09:12.951 10:35:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:12.951 10:35:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.951 10:35:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:12.951 ************************************ 00:09:12.951 START TEST bdev_fio 00:09:12.951 ************************************ 00:09:12.951 10:35:48 blockdev_general.bdev_fio 
-- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:12.951 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:12.951 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@343 -- # echo filename=concat0 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.952 10:35:48 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:13.210 ************************************ 00:09:13.210 START TEST bdev_fio_rw_verify 00:09:13.210 ************************************ 00:09:13.210 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 
-- # local asan_lib= 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:13.211 10:35:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:13.470 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:13.470 fio-3.35 00:09:13.470 Starting 16 threads 00:09:25.676 00:09:25.676 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2001041: Fri Jul 12 10:35:59 2024 00:09:25.676 read: IOPS=87.2k, BW=340MiB/s (357MB/s)(3405MiB/10001msec) 00:09:25.676 slat (nsec): min=1878, max=1552.1k, avg=36679.19, stdev=14997.15 00:09:25.676 clat (usec): min=11, max=2047, avg=300.63, stdev=135.60 00:09:25.676 lat (usec): min=19, max=2098, avg=337.31, stdev=143.44 00:09:25.676 clat percentiles (usec): 00:09:25.676 | 50.000th=[ 297], 99.000th=[ 586], 99.900th=[ 685], 99.990th=[ 922], 00:09:25.676 | 99.999th=[ 1483] 00:09:25.676 write: IOPS=137k, BW=534MiB/s (560MB/s)(5277MiB/9877msec); 0 zone resets 00:09:25.676 slat (usec): min=4, max=808, avg=50.47, stdev=15.68 00:09:25.676 clat (usec): min=12, max=1863, avg=356.43, stdev=161.48 00:09:25.676 lat (usec): min=37, max=2096, avg=406.90, stdev=169.45 00:09:25.676 clat percentiles (usec): 00:09:25.676 | 50.000th=[ 343], 99.000th=[ 807], 99.900th=[ 996], 99.990th=[ 1074], 00:09:25.676 | 99.999th=[ 1516] 00:09:25.676 bw ( KiB/s): min=479184, max=718373, per=98.93%, avg=541206.58, stdev=3749.60, samples=304 00:09:25.676 iops : min=119796, max=179591, avg=135301.53, stdev=937.38, samples=304 00:09:25.676 lat (usec) : 20=0.01%, 50=0.42%, 100=4.15%, 250=29.05%, 500=50.99% 00:09:25.676 lat (usec) : 750=14.55%, 1000=0.79% 00:09:25.676 lat (msec) : 2=0.05%, 4=0.01% 00:09:25.676 cpu : usr=99.22%, sys=0.36%, ctx=666, majf=0, minf=2757 00:09:25.676 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:25.676 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:25.676 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:25.676 issued rwts: total=871640,1350825,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:25.676 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:25.676 00:09:25.676 Run status group 0 (all jobs): 00:09:25.676 READ: bw=340MiB/s (357MB/s), 340MiB/s-340MiB/s (357MB/s-357MB/s), io=3405MiB (3570MB), run=10001-10001msec 00:09:25.676 WRITE: bw=534MiB/s (560MB/s), 534MiB/s-534MiB/s (560MB/s-560MB/s), io=5277MiB (5533MB), run=9877-9877msec 00:09:25.676 00:09:25.676 real 0m11.838s 00:09:25.676 user 2m45.062s 00:09:25.676 sys 0m1.352s 00:09:25.676 10:36:00 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:25.676 10:36:00 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:25.676 ************************************ 00:09:25.676 END TEST bdev_fio_rw_verify 00:09:25.676 ************************************ 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 
0 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:25.676 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:25.677 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:25.677 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:25.677 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:25.677 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:25.678 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "15b59281-4c35-4933-91ef-16eafc81eef6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "15b59281-4c35-4933-91ef-16eafc81eef6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ceb39c66-5201-5b78-bb6c-c69e2ce302b7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ceb39c66-5201-5b78-bb6c-c69e2ce302b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' 
' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "ac07412a-683c-54a4-98ad-26223bdf550d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ac07412a-683c-54a4-98ad-26223bdf550d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f98efac8-633a-5712-a160-080060a8ce8e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f98efac8-633a-5712-a160-080060a8ce8e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "32c99dfb-c543-50f1-a1d9-7aa25dfbbe99"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "32c99dfb-c543-50f1-a1d9-7aa25dfbbe99",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' 
"offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "74ae2a83-88e4-57f9-8ecf-1132ad4a3954"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "74ae2a83-88e4-57f9-8ecf-1132ad4a3954",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "004a9bce-3f9c-5a71-89e7-5f4eb7fd93ce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "004a9bce-3f9c-5a71-89e7-5f4eb7fd93ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "085fe99a-e648-54f6-9146-c2b09eede404"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "085fe99a-e648-54f6-9146-c2b09eede404",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "3fc270a7-627c-5e5d-85cd-efbe5d15dbf9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3fc270a7-627c-5e5d-85cd-efbe5d15dbf9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' 
"write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0b8ba2b9-3a31-5231-962a-c22cb63135ea"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0b8ba2b9-3a31-5231-962a-c22cb63135ea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "159ca879-9e95-5af0-9b32-e1e1758788d9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "159ca879-9e95-5af0-9b32-e1e1758788d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "925ef954-63bb-58cb-9b3c-34f28223120a"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "925ef954-63bb-58cb-9b3c-34f28223120a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' 
"7fdf3dc8-9001-409f-9dae-db12d81478ba"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "7fdf3dc8-9001-409f-9dae-db12d81478ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "7fdf3dc8-9001-409f-9dae-db12d81478ba",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "2c4bc5a6-be06-485f-a9bd-38830e948838",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "03f4126c-51a7-47d0-a8f8-a911331bffc2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "67398443-3c22-4366-9d66-e2669cdd8da8"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "67398443-3c22-4366-9d66-e2669cdd8da8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "67398443-3c22-4366-9d66-e2669cdd8da8",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "ad700643-ed6a-4847-a5bf-6654bd62a8c0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a771bc09-fff9-4de1-b2ba-603656ae7491",' ' "is_configured": true,' ' "data_offset": 0,' ' 
"data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "df8c5d96-84fe-44b0-b173-47a03edfdcdb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "df8c5d96-84fe-44b0-b173-47a03edfdcdb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "df8c5d96-84fe-44b0-b173-47a03edfdcdb",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "69856429-6b73-4cee-802c-51ee6d902da3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "119ff60d-10e8-4666-af59-f9a5bab65d2c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "93ca69a0-37c2-4c21-8dfe-0d47ea6d499d"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "93ca69a0-37c2-4c21-8dfe-0d47ea6d499d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:25.678 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:25.678 Malloc1p0 00:09:25.678 Malloc1p1 00:09:25.678 Malloc2p0 00:09:25.678 Malloc2p1 00:09:25.678 Malloc2p2 00:09:25.678 Malloc2p3 00:09:25.678 Malloc2p4 00:09:25.678 Malloc2p5 00:09:25.678 Malloc2p6 00:09:25.678 Malloc2p7 00:09:25.678 TestPT 00:09:25.678 raid0 00:09:25.678 concat0 ]] 00:09:25.678 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' 
"name": "Malloc0",' ' "aliases": [' ' "15b59281-4c35-4933-91ef-16eafc81eef6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "15b59281-4c35-4933-91ef-16eafc81eef6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ceb39c66-5201-5b78-bb6c-c69e2ce302b7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ceb39c66-5201-5b78-bb6c-c69e2ce302b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "ac07412a-683c-54a4-98ad-26223bdf550d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ac07412a-683c-54a4-98ad-26223bdf550d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f98efac8-633a-5712-a160-080060a8ce8e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f98efac8-633a-5712-a160-080060a8ce8e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": 
false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "32c99dfb-c543-50f1-a1d9-7aa25dfbbe99"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "32c99dfb-c543-50f1-a1d9-7aa25dfbbe99",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "74ae2a83-88e4-57f9-8ecf-1132ad4a3954"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "74ae2a83-88e4-57f9-8ecf-1132ad4a3954",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "004a9bce-3f9c-5a71-89e7-5f4eb7fd93ce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "004a9bce-3f9c-5a71-89e7-5f4eb7fd93ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "085fe99a-e648-54f6-9146-c2b09eede404"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"085fe99a-e648-54f6-9146-c2b09eede404",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "3fc270a7-627c-5e5d-85cd-efbe5d15dbf9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3fc270a7-627c-5e5d-85cd-efbe5d15dbf9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0b8ba2b9-3a31-5231-962a-c22cb63135ea"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0b8ba2b9-3a31-5231-962a-c22cb63135ea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "159ca879-9e95-5af0-9b32-e1e1758788d9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "159ca879-9e95-5af0-9b32-e1e1758788d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "925ef954-63bb-58cb-9b3c-34f28223120a"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "925ef954-63bb-58cb-9b3c-34f28223120a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "7fdf3dc8-9001-409f-9dae-db12d81478ba"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "7fdf3dc8-9001-409f-9dae-db12d81478ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "7fdf3dc8-9001-409f-9dae-db12d81478ba",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "2c4bc5a6-be06-485f-a9bd-38830e948838",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "03f4126c-51a7-47d0-a8f8-a911331bffc2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "67398443-3c22-4366-9d66-e2669cdd8da8"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "67398443-3c22-4366-9d66-e2669cdd8da8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "67398443-3c22-4366-9d66-e2669cdd8da8",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "ad700643-ed6a-4847-a5bf-6654bd62a8c0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a771bc09-fff9-4de1-b2ba-603656ae7491",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "df8c5d96-84fe-44b0-b173-47a03edfdcdb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "df8c5d96-84fe-44b0-b173-47a03edfdcdb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "df8c5d96-84fe-44b0-b173-47a03edfdcdb",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "69856429-6b73-4cee-802c-51ee6d902da3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "119ff60d-10e8-4666-af59-f9a5bab65d2c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "93ca69a0-37c2-4c21-8dfe-0d47ea6d499d"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "93ca69a0-37c2-4c21-8dfe-0d47ea6d499d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 
0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 
'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:25.679 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:25.680 10:36:00 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:25.680 ************************************ 00:09:25.680 START TEST bdev_fio_trim 00:09:25.680 ************************************ 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:25.680 10:36:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:25.680 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:25.680 fio-3.35 00:09:25.680 Starting 14 threads 00:09:37.885 00:09:37.885 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2002817: Fri Jul 12 10:36:11 2024 00:09:37.885 write: IOPS=124k, BW=483MiB/s (506MB/s)(4829MiB/10001msec); 0 zone resets 00:09:37.885 slat (usec): min=2, max=1327, avg=40.17, stdev=11.26 00:09:37.885 clat (usec): min=26, max=4439, avg=282.25, stdev=98.63 00:09:37.885 lat (usec): min=36, max=4486, avg=322.42, stdev=103.00 00:09:37.885 clat percentiles (usec): 00:09:37.885 | 50.000th=[ 273], 99.000th=[ 490], 99.900th=[ 545], 99.990th=[ 594], 00:09:37.885 | 99.999th=[ 906] 00:09:37.885 bw ( KiB/s): min=443102, max=655330, per=100.00%, avg=495416.47, stdev=3319.26, samples=266 00:09:37.885 iops : min=110775, max=163830, avg=123853.95, stdev=829.78, samples=266 00:09:37.885 trim: IOPS=124k, BW=483MiB/s (506MB/s)(4829MiB/10001msec); 0 zone resets 00:09:37.885 slat (usec): min=4, max=155, avg=27.10, stdev= 7.22 00:09:37.885 clat (usec): min=7, max=4486, avg=321.83, stdev=104.13 00:09:37.885 lat (usec): min=19, max=4504, avg=348.93, stdev=107.21 00:09:37.885 clat percentiles (usec): 00:09:37.885 | 50.000th=[ 318], 99.000th=[ 537], 99.900th=[ 594], 99.990th=[ 652], 00:09:37.885 | 99.999th=[ 955] 00:09:37.885 bw ( KiB/s): min=443102, max=655338, per=100.00%, avg=495416.05, stdev=3319.37, samples=266 00:09:37.885 iops : min=110775, max=163832, avg=123853.95, stdev=829.81, samples=266 00:09:37.885 lat (usec) : 10=0.01%, 20=0.01%, 50=0.08%, 100=1.05%, 250=33.56% 00:09:37.885 lat (usec) : 
500=62.98%, 750=2.33%, 1000=0.01% 00:09:37.885 lat (msec) : 2=0.01%, 10=0.01% 00:09:37.885 cpu : usr=99.60%, sys=0.01%, ctx=569, majf=0, minf=1020 00:09:37.885 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:37.885 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:37.885 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:37.885 issued rwts: total=0,1236247,1236248,0 short=0,0,0,0 dropped=0,0,0,0 00:09:37.885 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:37.885 00:09:37.885 Run status group 0 (all jobs): 00:09:37.885 WRITE: bw=483MiB/s (506MB/s), 483MiB/s-483MiB/s (506MB/s-506MB/s), io=4829MiB (5064MB), run=10001-10001msec 00:09:37.885 TRIM: bw=483MiB/s (506MB/s), 483MiB/s-483MiB/s (506MB/s-506MB/s), io=4829MiB (5064MB), run=10001-10001msec 00:09:37.885 00:09:37.885 real 0m11.427s 00:09:37.885 user 2m25.582s 00:09:37.885 sys 0m0.683s 00:09:37.885 10:36:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:37.885 10:36:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:37.885 ************************************ 00:09:37.885 END TEST bdev_fio_trim 00:09:37.885 ************************************ 00:09:37.885 10:36:11 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:37.885 10:36:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:37.885 10:36:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:37.885 10:36:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:37.885 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:37.885 10:36:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:37.885 00:09:37.885 real 0m23.651s 00:09:37.885 user 5m10.845s 00:09:37.885 sys 0m2.252s 00:09:37.885 10:36:11 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:37.885 10:36:11 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:37.885 ************************************ 00:09:37.885 END TEST bdev_fio 00:09:37.885 ************************************ 00:09:37.885 10:36:11 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:37.885 10:36:11 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:37.885 10:36:11 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:37.885 10:36:11 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:37.885 10:36:11 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:37.885 10:36:11 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:37.885 ************************************ 00:09:37.885 START TEST bdev_verify 00:09:37.885 ************************************ 00:09:37.885 10:36:11 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:37.885 [2024-07-12 10:36:11.843282] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 
24.03.0 initialization... 00:09:37.885 [2024-07-12 10:36:11.843344] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2004692 ] 00:09:37.885 [2024-07-12 10:36:11.962067] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:37.885 [2024-07-12 10:36:12.060721] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:37.885 [2024-07-12 10:36:12.060726] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:37.886 [2024-07-12 10:36:12.211694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:37.886 [2024-07-12 10:36:12.211753] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:37.886 [2024-07-12 10:36:12.211767] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:37.886 [2024-07-12 10:36:12.219696] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:37.886 [2024-07-12 10:36:12.219723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:37.886 [2024-07-12 10:36:12.227713] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:37.886 [2024-07-12 10:36:12.227739] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:37.886 [2024-07-12 10:36:12.300817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:37.886 [2024-07-12 10:36:12.300866] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:37.886 [2024-07-12 10:36:12.300884] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b64d0 00:09:37.886 [2024-07-12 10:36:12.300902] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:37.886 [2024-07-12 10:36:12.302523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:37.886 [2024-07-12 10:36:12.302552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:37.886 Running I/O for 5 seconds... 
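For reference, the bdev_verify stage above reduces to the single bdevperf invocation captured in the xtrace. A minimal sketch follows, assuming the workspace layout shown in this log; the flag notes restate only what the trace itself shows (queue depth 128, 4096-byte verify I/O, 5-second run, core mask 0x3 giving the two reactors started above). The per-bdev latency table below is the output of this run.

# Minimal sketch of the bdevperf command recorded in the xtrace above.
# -q 128   queue depth per job (see the "depth: 128" column in the results)
# -o 4096  I/O size in bytes
# -w verify  verification workload
# -t 5     run time in seconds ("Running I/O for 5 seconds...")
# -m 0x3   core mask: cores 0 and 1, matching the two reactors above
# -C is passed exactly as in the trace.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path taken from this log
"$SPDK/build/examples/bdevperf" \
    --json "$SPDK/test/bdev/bdev.json" \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3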
00:09:43.216 00:09:43.216 Latency(us) 00:09:43.216 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:43.216 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.216 Verification LBA range: start 0x0 length 0x1000 00:09:43.216 Malloc0 : 5.15 1043.08 4.07 0.00 0.00 122440.04 683.85 443137.34 00:09:43.216 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.216 Verification LBA range: start 0x1000 length 0x1000 00:09:43.216 Malloc0 : 5.14 1020.38 3.99 0.00 0.00 125159.32 601.93 488727.60 00:09:43.216 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.216 Verification LBA range: start 0x0 length 0x800 00:09:43.216 Malloc1p0 : 5.16 546.06 2.13 0.00 0.00 232954.49 3618.73 238892.97 00:09:43.216 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.216 Verification LBA range: start 0x800 length 0x800 00:09:43.216 Malloc1p0 : 5.15 547.28 2.14 0.00 0.00 232474.69 3618.73 238892.97 00:09:43.216 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.216 Verification LBA range: start 0x0 length 0x800 00:09:43.216 Malloc1p1 : 5.16 545.78 2.13 0.00 0.00 232327.44 3547.49 232510.33 00:09:43.216 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.216 Verification LBA range: start 0x800 length 0x800 00:09:43.216 Malloc1p1 : 5.15 547.03 2.14 0.00 0.00 231788.00 3590.23 233422.14 00:09:43.217 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p0 : 5.16 545.55 2.13 0.00 0.00 231664.79 3704.21 231598.53 00:09:43.217 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p0 : 5.15 546.78 2.14 0.00 0.00 231121.90 3704.21 232510.33 00:09:43.217 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p1 : 5.16 545.32 2.13 0.00 0.00 231068.95 3761.20 228863.11 00:09:43.217 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p1 : 5.15 546.54 2.13 0.00 0.00 230532.18 3789.69 228863.11 00:09:43.217 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p2 : 5.17 545.09 2.13 0.00 0.00 230437.52 3675.71 227039.50 00:09:43.217 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p2 : 5.15 546.31 2.13 0.00 0.00 229902.07 3675.71 227039.50 00:09:43.217 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p3 : 5.17 544.87 2.13 0.00 0.00 229787.26 3433.52 222480.47 00:09:43.217 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p3 : 5.16 546.00 2.13 0.00 0.00 229299.13 3447.76 223392.28 00:09:43.217 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p4 : 5.17 544.64 2.13 0.00 0.00 229157.54 
3504.75 221568.67 00:09:43.217 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p4 : 5.27 558.11 2.18 0.00 0.00 223737.04 3533.25 221568.67 00:09:43.217 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p5 : 5.28 558.01 2.18 0.00 0.00 223143.69 3504.75 220656.86 00:09:43.217 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p5 : 5.28 557.90 2.18 0.00 0.00 223176.16 3547.49 220656.86 00:09:43.217 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p6 : 5.28 557.80 2.18 0.00 0.00 222586.76 3447.76 218833.25 00:09:43.217 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p6 : 5.28 557.69 2.18 0.00 0.00 222601.90 3433.52 219745.06 00:09:43.217 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x200 00:09:43.217 Malloc2p7 : 5.28 557.59 2.18 0.00 0.00 222020.01 3462.01 217009.64 00:09:43.217 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x200 length 0x200 00:09:43.217 Malloc2p7 : 5.28 557.48 2.18 0.00 0.00 222014.22 3476.26 217009.64 00:09:43.217 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x1000 00:09:43.217 TestPT : 5.28 536.63 2.10 0.00 0.00 229088.65 19603.81 217009.64 00:09:43.217 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x1000 length 0x1000 00:09:43.217 TestPT : 5.28 532.84 2.08 0.00 0.00 231318.69 21655.37 289954.06 00:09:43.217 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x2000 00:09:43.217 raid0 : 5.29 557.04 2.18 0.00 0.00 220575.54 3704.21 196949.93 00:09:43.217 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x2000 length 0x2000 00:09:43.217 raid0 : 5.29 556.66 2.17 0.00 0.00 220716.80 3675.71 188743.68 00:09:43.217 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x2000 00:09:43.217 concat0 : 5.29 556.63 2.17 0.00 0.00 220095.06 3533.25 191479.10 00:09:43.217 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x2000 length 0x2000 00:09:43.217 concat0 : 5.29 556.26 2.17 0.00 0.00 220192.18 3547.49 186008.26 00:09:43.217 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x0 length 0x1000 00:09:43.217 raid1 : 5.29 556.22 2.17 0.00 0.00 219505.15 4131.62 184184.65 00:09:43.217 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x1000 length 0x1000 00:09:43.217 raid1 : 5.30 555.90 2.17 0.00 0.00 219631.04 4103.12 177802.02 00:09:43.217 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: 
start 0x0 length 0x4e2 00:09:43.217 AIO0 : 5.30 555.91 2.17 0.00 0.00 218904.79 1773.75 185096.46 00:09:43.217 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:43.217 Verification LBA range: start 0x4e2 length 0x4e2 00:09:43.217 AIO0 : 5.30 555.60 2.17 0.00 0.00 219014.62 1780.87 182361.04 00:09:43.217 =================================================================================================================== 00:09:43.217 Total : 18584.99 72.60 0.00 0.00 214739.30 601.93 488727.60 00:09:43.217 00:09:43.217 real 0m6.534s 00:09:43.217 user 0m12.146s 00:09:43.217 sys 0m0.393s 00:09:43.217 10:36:18 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:43.217 10:36:18 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:09:43.217 ************************************ 00:09:43.217 END TEST bdev_verify 00:09:43.217 ************************************ 00:09:43.217 10:36:18 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:43.217 10:36:18 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:43.217 10:36:18 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:43.217 10:36:18 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:43.217 10:36:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:43.217 ************************************ 00:09:43.217 START TEST bdev_verify_big_io 00:09:43.217 ************************************ 00:09:43.217 10:36:18 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:43.476 [2024-07-12 10:36:18.455295] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
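The bdev_verify_big_io stage that starts here reuses the same invocation with a 64 KiB I/O size; a sketch under the same assumptions as the previous one:

# Same command as the 4 KiB verify run, but with -o 65536 (64 KiB I/Os).
# The WARNING lines that follow show bdevperf capping the effective queue
# depth at 32 for the small Malloc2pX split bdevs (8192 blocks x 512 B each),
# since -q 128 exceeds the number of I/Os those bdevs can accept at once.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/examples/bdevperf" \
    --json "$SPDK/test/bdev/bdev.json" \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3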
00:09:43.476 [2024-07-12 10:36:18.455357] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2005498 ] 00:09:43.476 [2024-07-12 10:36:18.584084] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:43.735 [2024-07-12 10:36:18.687436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:43.735 [2024-07-12 10:36:18.687441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:43.735 [2024-07-12 10:36:18.842959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:43.735 [2024-07-12 10:36:18.843011] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:43.735 [2024-07-12 10:36:18.843025] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:43.735 [2024-07-12 10:36:18.850970] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:43.735 [2024-07-12 10:36:18.850998] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:43.735 [2024-07-12 10:36:18.858986] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:43.735 [2024-07-12 10:36:18.859012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:43.995 [2024-07-12 10:36:18.936061] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:43.995 [2024-07-12 10:36:18.936109] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:43.995 [2024-07-12 10:36:18.936129] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x257f4d0 00:09:43.995 [2024-07-12 10:36:18.936141] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:43.995 [2024-07-12 10:36:18.937767] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:43.995 [2024-07-12 10:36:18.937798] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:43.995 [2024-07-12 10:36:19.097377] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.098507] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.100132] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.101211] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.102711] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.103718] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.105221] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.106723] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.107706] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.109201] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.110219] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.111558] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.112363] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.113643] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.114452] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.115718] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:09:43.995 [2024-07-12 10:36:19.136941] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:43.995 [2024-07-12 10:36:19.138765] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:43.995 Running I/O for 5 seconds... 00:09:52.118 00:09:52.118 Latency(us) 00:09:52.118 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:52.118 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x100 00:09:52.118 Malloc0 : 5.88 174.29 10.89 0.00 0.00 719668.82 883.31 1925732.62 00:09:52.118 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x100 length 0x100 00:09:52.118 Malloc0 : 5.94 150.91 9.43 0.00 0.00 832163.70 879.75 2275865.82 00:09:52.118 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x80 00:09:52.118 Malloc1p0 : 6.61 36.29 2.27 0.00 0.00 3216849.12 1474.56 5485420.19 00:09:52.118 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x80 length 0x80 00:09:52.118 Malloc1p0 : 6.26 87.50 5.47 0.00 0.00 1347135.91 2550.21 2684354.56 00:09:52.118 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x80 00:09:52.118 Malloc1p1 : 6.62 36.28 2.27 0.00 0.00 3110953.63 1517.30 5281175.82 00:09:52.118 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x80 length 0x80 00:09:52.118 Malloc1p1 : 6.82 35.18 2.20 0.00 0.00 3146429.59 1524.42 5339531.35 00:09:52.118 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p0 : 6.25 25.59 1.60 0.00 0.00 1114623.99 644.67 2115388.10 00:09:52.118 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.118 Malloc2p0 : 6.19 23.28 1.45 0.00 0.00 1207671.23 655.36 1969499.27 00:09:52.118 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p1 : 6.25 25.58 1.60 0.00 0.00 1104731.79 644.67 2086210.34 00:09:52.118 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.118 Malloc2p1 : 6.19 23.27 1.45 0.00 0.00 1197440.53 655.36 1954910.39 00:09:52.118 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p2 : 6.26 25.57 1.60 0.00 0.00 1095093.28 644.67 2057032.57 00:09:52.118 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.118 Malloc2p2 : 6.27 25.53 1.60 0.00 0.00 1099766.57 662.48 1925732.62 00:09:52.118 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p3 : 6.26 25.57 1.60 0.00 0.00 1084894.95 644.67 2027854.80 00:09:52.118 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.118 Malloc2p3 : 6.27 25.53 1.60 0.00 0.00 1089896.88 673.17 1911143.74 00:09:52.118 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p4 : 6.26 25.56 1.60 0.00 0.00 1075150.04 687.42 1998677.04 00:09:52.118 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.118 Malloc2p4 : 6.27 25.52 1.59 0.00 0.00 1080476.12 1332.09 1881965.97 00:09:52.118 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p5 : 6.26 25.56 1.60 0.00 0.00 1065330.65 641.11 1969499.27 00:09:52.118 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.118 Malloc2p5 : 6.27 25.51 1.59 0.00 0.00 1071206.63 655.36 1852788.20 00:09:52.118 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p6 : 6.26 25.55 1.60 0.00 0.00 1055803.49 648.24 1940321.50 00:09:52.118 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.118 Malloc2p6 : 6.27 25.51 1.59 0.00 0.00 1061571.95 641.11 1830904.88 00:09:52.118 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x0 length 0x20 00:09:52.118 Malloc2p7 : 6.26 25.55 1.60 0.00 0.00 1045378.62 644.67 1925732.62 00:09:52.118 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:52.118 Verification LBA range: start 0x20 length 0x20 00:09:52.119 Malloc2p7 : 6.27 25.50 1.59 0.00 0.00 1052390.34 658.92 1801727.11 00:09:52.119 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x0 length 0x100 00:09:52.119 TestPT : 6.71 38.16 2.39 0.00 0.00 2669276.06 1474.56 4872687.08 00:09:52.119 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x100 length 0x100 00:09:52.119 TestPT : 6.87 32.90 2.06 0.00 0.00 3076104.01 84797.89 3486743.15 00:09:52.119 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x0 length 0x200 00:09:52.119 raid0 : 6.51 45.65 2.85 0.00 0.00 2197649.06 1595.66 4697620.48 00:09:52.119 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x200 length 0x200 00:09:52.119 raid0 : 6.87 39.57 2.47 0.00 0.00 2506020.23 1574.29 4755976.01 00:09:52.119 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x0 length 0x200 00:09:52.119 concat0 : 6.62 48.35 3.02 0.00 0.00 2017653.95 1617.03 4522553.88 00:09:52.119 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x200 length 0x200 00:09:52.119 concat0 : 6.83 46.88 2.93 0.00 
0.00 2086193.03 1560.04 4580909.41 00:09:52.119 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x0 length 0x100 00:09:52.119 raid1 : 6.88 62.81 3.93 0.00 0.00 1519598.44 2051.56 4347487.28 00:09:52.119 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x100 length 0x100 00:09:52.119 raid1 : 6.87 68.83 4.30 0.00 0.00 1400257.42 2080.06 4405842.81 00:09:52.119 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x0 length 0x4e 00:09:52.119 AIO0 : 6.88 59.88 3.74 0.00 0.00 944297.58 812.08 2801065.63 00:09:52.119 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:09:52.119 Verification LBA range: start 0x4e length 0x4e 00:09:52.119 AIO0 : 6.88 58.18 3.64 0.00 0.00 983412.24 819.20 2844832.28 00:09:52.119 =================================================================================================================== 00:09:52.119 Total : 1425.85 89.12 0.00 0.00 1466863.28 641.11 5485420.19 00:09:52.119 00:09:52.119 real 0m8.129s 00:09:52.119 user 0m15.304s 00:09:52.119 sys 0m0.419s 00:09:52.119 10:36:26 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:52.119 10:36:26 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:09:52.119 ************************************ 00:09:52.119 END TEST bdev_verify_big_io 00:09:52.119 ************************************ 00:09:52.119 10:36:26 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:52.119 10:36:26 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:52.119 10:36:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:52.119 10:36:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.119 10:36:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:52.119 ************************************ 00:09:52.119 START TEST bdev_write_zeroes 00:09:52.119 ************************************ 00:09:52.119 10:36:26 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:52.119 [2024-07-12 10:36:26.668345] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
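The MiB/s column in these bdevperf summaries is simply IOPS multiplied by the job's I/O size, and the two totals printed so far are consistent with that: 18584.99 IOPS at 4096 bytes works out to 72.60 MiB/s for the verify pass, and 1425.85 IOPS at 65536 bytes works out to 89.12 MiB/s for the big-I/O pass just completed. A quick check of both figures:

  # Recomputes the MiB/s totals from the IOPS totals reported above.
  awk 'BEGIN {
      printf "verify        : %9.2f IOPS x  4096 B = %6.2f MiB/s\n", 18584.99, 18584.99 * 4096 / 1048576
      printf "verify_big_io : %9.2f IOPS x 65536 B = %6.2f MiB/s\n", 1425.85, 1425.85 * 65536 / 1048576
  }'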
00:09:52.119 [2024-07-12 10:36:26.668408] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2006555 ] 00:09:52.119 [2024-07-12 10:36:26.795992] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.119 [2024-07-12 10:36:26.896714] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.119 [2024-07-12 10:36:27.059860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:52.119 [2024-07-12 10:36:27.059914] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:52.119 [2024-07-12 10:36:27.059929] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:52.119 [2024-07-12 10:36:27.067869] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:52.119 [2024-07-12 10:36:27.067897] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:52.119 [2024-07-12 10:36:27.075877] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:52.119 [2024-07-12 10:36:27.075901] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:52.119 [2024-07-12 10:36:27.153066] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:52.119 [2024-07-12 10:36:27.153117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:52.119 [2024-07-12 10:36:27.153136] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fc9c10 00:09:52.119 [2024-07-12 10:36:27.153149] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:52.119 [2024-07-12 10:36:27.154636] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:52.119 [2024-07-12 10:36:27.154665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:52.378 Running I/O for 1 seconds... 
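The pass that follows uses -w write_zeroes, so bdevperf issues zero-fill requests rather than buffered writes; for bdevs that do not support the operation natively, the generic bdev layer can emulate it with zero-filled writes, which is why these rates are not directly comparable to a plain write workload. Whether a bdev advertises native support shows up in its bdev_get_bdevs output (the Malloc disk dumps later in this log report "write_zeroes": true); a quick, illustrative way to query it against a running SPDK application, assuming rpc.py can reach the default socket and jq is installed:

  # Not part of the harness: needs a live SPDK app that exposes a bdev named Malloc0.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$SPDK_DIR/scripts/rpc.py" bdev_get_bdevs -b Malloc0 | jq '.[0].supported_io_types.write_zeroes'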
00:09:53.316 00:09:53.316 Latency(us) 00:09:53.316 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:53.316 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc0 : 1.03 4967.32 19.40 0.00 0.00 25736.54 669.61 43082.80 00:09:53.316 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc1p0 : 1.03 4960.14 19.38 0.00 0.00 25728.66 904.68 42170.99 00:09:53.316 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc1p1 : 1.03 4953.00 19.35 0.00 0.00 25708.42 908.24 41259.19 00:09:53.316 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p0 : 1.04 4945.83 19.32 0.00 0.00 25686.94 911.81 40575.33 00:09:53.316 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p1 : 1.04 4938.75 19.29 0.00 0.00 25669.23 897.56 39663.53 00:09:53.316 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p2 : 1.04 4931.66 19.26 0.00 0.00 25644.83 908.24 38751.72 00:09:53.316 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p3 : 1.06 4961.63 19.38 0.00 0.00 25435.71 908.24 37839.92 00:09:53.316 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p4 : 1.06 4954.66 19.35 0.00 0.00 25416.51 901.12 36928.11 00:09:53.316 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p5 : 1.06 4947.75 19.33 0.00 0.00 25398.11 911.81 36016.31 00:09:53.316 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p6 : 1.06 4940.76 19.30 0.00 0.00 25379.15 908.24 35104.50 00:09:53.316 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 Malloc2p7 : 1.06 4933.85 19.27 0.00 0.00 25359.28 904.68 34192.70 00:09:53.316 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 TestPT : 1.07 4926.96 19.25 0.00 0.00 25341.28 947.42 33280.89 00:09:53.316 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 raid0 : 1.07 4919.01 19.21 0.00 0.00 25308.85 1624.15 31685.23 00:09:53.316 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 concat0 : 1.07 4911.21 19.18 0.00 0.00 25256.18 1617.03 30089.57 00:09:53.316 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 raid1 : 1.07 4901.48 19.15 0.00 0.00 25190.24 2578.70 27354.16 00:09:53.316 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:53.316 AIO0 : 1.07 4895.57 19.12 0.00 0.00 25100.67 1040.03 26784.28 00:09:53.316 =================================================================================================================== 00:09:53.316 Total : 78989.57 308.55 0.00 0.00 25457.86 669.61 43082.80 00:09:53.884 00:09:53.884 real 0m2.229s 00:09:53.884 user 0m1.841s 00:09:53.884 sys 0m0.336s 00:09:53.884 10:36:28 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:53.884 10:36:28 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:09:53.884 ************************************ 00:09:53.884 END TEST bdev_write_zeroes 00:09:53.884 ************************************ 00:09:53.884 10:36:28 blockdev_general 
-- common/autotest_common.sh@1142 -- # return 0 00:09:53.884 10:36:28 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:53.884 10:36:28 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:53.884 10:36:28 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:53.884 10:36:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:53.884 ************************************ 00:09:53.884 START TEST bdev_json_nonenclosed 00:09:53.884 ************************************ 00:09:53.884 10:36:28 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:53.884 [2024-07-12 10:36:28.972903] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:09:53.884 [2024-07-12 10:36:28.972961] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2006869 ] 00:09:54.144 [2024-07-12 10:36:29.100766] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.144 [2024-07-12 10:36:29.197443] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:54.144 [2024-07-12 10:36:29.197520] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:09:54.144 [2024-07-12 10:36:29.197541] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:54.144 [2024-07-12 10:36:29.197554] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:54.144 00:09:54.144 real 0m0.385s 00:09:54.144 user 0m0.233s 00:09:54.144 sys 0m0.149s 00:09:54.144 10:36:29 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:09:54.144 10:36:29 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:54.144 10:36:29 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:09:54.144 ************************************ 00:09:54.144 END TEST bdev_json_nonenclosed 00:09:54.144 ************************************ 00:09:54.404 10:36:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:54.404 10:36:29 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:09:54.404 10:36:29 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:54.404 10:36:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:54.404 10:36:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:54.404 10:36:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:54.404 ************************************ 00:09:54.404 START TEST bdev_json_nonarray 00:09:54.404 ************************************ 00:09:54.404 10:36:29 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:54.404 [2024-07-12 10:36:29.442865] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:09:54.404 [2024-07-12 10:36:29.442925] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2006970 ] 00:09:54.404 [2024-07-12 10:36:29.569735] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.663 [2024-07-12 10:36:29.671322] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:54.663 [2024-07-12 10:36:29.671399] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
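Both bdev_json_nonenclosed and bdev_json_nonarray are negative tests: bdevperf is handed a deliberately malformed configuration, is expected to reject it and exit non-zero, and the wrapper records the failure (the es=234 lines in the trace) before moving on via true. The snippets below are hypothetical illustrations of the two error classes, not the actual nonenclosed.json / nonarray.json fixtures shipped with SPDK:

  # Hypothetical configs shaped to match the two messages logged above.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  printf '[]\n' > /tmp/nonenclosed.json                  # valid JSON, but the top level is not enclosed in {}
  printf '{ "subsystems": {} }\n' > /tmp/nonarray.json   # "subsystems" present, but not an array
  "$SPDK_DIR/build/examples/bdevperf" --json /tmp/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 \
      || echo "expected failure (exit status $?)"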
00:09:54.663 [2024-07-12 10:36:29.671419] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:54.663 [2024-07-12 10:36:29.671432] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:54.663 00:09:54.663 real 0m0.399s 00:09:54.663 user 0m0.236s 00:09:54.663 sys 0m0.159s 00:09:54.663 10:36:29 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:09:54.663 10:36:29 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:54.663 10:36:29 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:09:54.663 ************************************ 00:09:54.663 END TEST bdev_json_nonarray 00:09:54.663 ************************************ 00:09:54.663 10:36:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:54.663 10:36:29 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:09:54.663 10:36:29 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:09:54.663 10:36:29 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:09:54.663 10:36:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:54.663 10:36:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:54.663 10:36:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:54.922 ************************************ 00:09:54.922 START TEST bdev_qos 00:09:54.922 ************************************ 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=2007076 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 2007076' 00:09:54.922 Process qos testing pid: 2007076 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 2007076 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 2007076 ']' 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:54.922 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:54.922 10:36:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.922 [2024-07-12 10:36:29.902300] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
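Unlike the earlier passes, the QoS suite starts bdevperf with -z, so the application comes up idle and waits on its RPC socket; the bdevs under test are then created over RPC and the measurement is kicked off with bdevperf.py perform_tests, all of which appears in the trace that follows. A condensed sketch of that flow, with paths taken from the trace (a real script should wait for the RPC socket, as waitforlisten does above, before issuing commands):

  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # Start bdevperf idle (-z): core 1 only (-m 0x2), a 256-deep 4 KiB randread job for 60 s once triggered.
  "$SPDK_DIR/build/examples/bdevperf" -z -m 0x2 -q 256 -o 4096 -w randread -t 60 &
  # Create the two bdevs exercised by the suite (128 MiB each, 512-byte blocks).
  "$SPDK_DIR/scripts/rpc.py" bdev_malloc_create -b Malloc_0 128 512
  "$SPDK_DIR/scripts/rpc.py" bdev_null_create Null_1 128 512
  # Trigger the run; QoS limits are applied later with bdev_set_qos_limit while I/O is in flight.
  "$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" perform_tests

Further down, qos_function_test samples the unthrottled rate with iostat.py, derives a limit from it (15000 IOPS here, with a 1000 IOPS floor; the bandwidth cases use 8 MB/s on Null_1 and a 2 MB/s read-only limit on Malloc_0), applies it with bdev_set_qos_limit and re-samples, passing only if the throttled rate lands within 90-110% of the configured limit. The 13500/16500, 7372/9011 and 1843/2252 bounds that appear in those checks are exactly that window, assuming it is computed as qos_limit*90/100 .. qos_limit*110/100:

  # Recomputes the acceptance windows seen in the run_qos_test checks further down.
  for qos_limit in 15000 8192 2048; do   # IOPS limit, then the two bandwidth limits in the KB units the check uses
      echo "limit=$qos_limit lower=$((qos_limit * 90 / 100)) upper=$((qos_limit * 110 / 100))"
  done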
00:09:54.922 [2024-07-12 10:36:29.902346] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2007076 ] 00:09:54.922 [2024-07-12 10:36:30.005247] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.922 [2024-07-12 10:36:30.113603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.861 Malloc_0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.861 [ 00:09:55.861 { 00:09:55.861 "name": "Malloc_0", 00:09:55.861 "aliases": [ 00:09:55.861 "4f80da81-16f2-4f91-9dbd-357c5a9f1c55" 00:09:55.861 ], 00:09:55.861 "product_name": "Malloc disk", 00:09:55.861 "block_size": 512, 00:09:55.861 "num_blocks": 262144, 00:09:55.861 "uuid": "4f80da81-16f2-4f91-9dbd-357c5a9f1c55", 00:09:55.861 "assigned_rate_limits": { 00:09:55.861 "rw_ios_per_sec": 0, 00:09:55.861 "rw_mbytes_per_sec": 0, 00:09:55.861 "r_mbytes_per_sec": 0, 00:09:55.861 "w_mbytes_per_sec": 0 00:09:55.861 }, 00:09:55.861 "claimed": false, 00:09:55.861 "zoned": false, 00:09:55.861 "supported_io_types": { 00:09:55.861 "read": true, 00:09:55.861 "write": true, 00:09:55.861 "unmap": true, 00:09:55.861 "flush": true, 00:09:55.861 "reset": true, 00:09:55.861 "nvme_admin": false, 00:09:55.861 "nvme_io": false, 00:09:55.861 "nvme_io_md": false, 00:09:55.861 "write_zeroes": true, 00:09:55.861 "zcopy": true, 00:09:55.861 "get_zone_info": false, 00:09:55.861 "zone_management": false, 00:09:55.861 "zone_append": false, 00:09:55.861 "compare": false, 
00:09:55.861 "compare_and_write": false, 00:09:55.861 "abort": true, 00:09:55.861 "seek_hole": false, 00:09:55.861 "seek_data": false, 00:09:55.861 "copy": true, 00:09:55.861 "nvme_iov_md": false 00:09:55.861 }, 00:09:55.861 "memory_domains": [ 00:09:55.861 { 00:09:55.861 "dma_device_id": "system", 00:09:55.861 "dma_device_type": 1 00:09:55.861 }, 00:09:55.861 { 00:09:55.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:55.861 "dma_device_type": 2 00:09:55.861 } 00:09:55.861 ], 00:09:55.861 "driver_specific": {} 00:09:55.861 } 00:09:55.861 ] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.861 Null_1 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.861 [ 00:09:55.861 { 00:09:55.861 "name": "Null_1", 00:09:55.861 "aliases": [ 00:09:55.861 "c986e02d-b475-4e1d-8dd3-de513b73128d" 00:09:55.861 ], 00:09:55.861 "product_name": "Null disk", 00:09:55.861 "block_size": 512, 00:09:55.861 "num_blocks": 262144, 00:09:55.861 "uuid": "c986e02d-b475-4e1d-8dd3-de513b73128d", 00:09:55.861 "assigned_rate_limits": { 00:09:55.861 "rw_ios_per_sec": 0, 00:09:55.861 "rw_mbytes_per_sec": 0, 00:09:55.861 "r_mbytes_per_sec": 0, 00:09:55.861 "w_mbytes_per_sec": 0 00:09:55.861 }, 00:09:55.861 "claimed": false, 00:09:55.861 "zoned": false, 00:09:55.861 "supported_io_types": { 00:09:55.861 "read": true, 00:09:55.861 "write": true, 00:09:55.861 "unmap": false, 00:09:55.861 "flush": false, 00:09:55.861 "reset": true, 00:09:55.861 "nvme_admin": false, 00:09:55.861 "nvme_io": false, 00:09:55.861 "nvme_io_md": false, 00:09:55.861 "write_zeroes": true, 00:09:55.861 "zcopy": false, 00:09:55.861 "get_zone_info": false, 00:09:55.861 "zone_management": false, 00:09:55.861 "zone_append": false, 00:09:55.861 
"compare": false, 00:09:55.861 "compare_and_write": false, 00:09:55.861 "abort": true, 00:09:55.861 "seek_hole": false, 00:09:55.861 "seek_data": false, 00:09:55.861 "copy": false, 00:09:55.861 "nvme_iov_md": false 00:09:55.861 }, 00:09:55.861 "driver_specific": {} 00:09:55.861 } 00:09:55.861 ] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:55.861 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:55.862 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:55.862 10:36:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:56.120 Running I/O for 60 seconds... 
00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 62510.90 250043.58 0.00 0.00 250880.00 0.00 0.00 ' 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=62510.90 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 62510 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=62510 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']' 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:01.385 10:36:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:01.385 ************************************ 00:10:01.385 START TEST bdev_qos_iops 00:10:01.385 ************************************ 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 15000 IOPS Malloc_0 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:01.385 10:36:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 15000.13 60000.52 0.00 0.00 61320.00 0.00 0.00 ' 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=15000.13 00:10:06.649 10:36:41 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 15000 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=15000 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15000 -lt 13500 ']' 00:10:06.649 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15000 -gt 16500 ']' 00:10:06.649 00:10:06.649 real 0m5.243s 00:10:06.649 user 0m0.119s 00:10:06.649 sys 0m0.039s 00:10:06.650 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:06.650 10:36:41 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:06.650 ************************************ 00:10:06.650 END TEST bdev_qos_iops 00:10:06.650 ************************************ 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:06.650 10:36:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20156.56 80626.24 0.00 0.00 81920.00 0.00 0.00 ' 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=81920.00 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 81920 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=81920 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:11.924 10:36:46 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:11.924 ************************************ 00:10:11.924 START TEST bdev_qos_bw 00:10:11.924 ************************************ 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:11.924 10:36:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2049.00 8195.98 0.00 0.00 8436.00 0.00 0.00 ' 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8436.00 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8436 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8436 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8436 -lt 7372 ']' 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8436 -gt 9011 ']' 00:10:17.198 00:10:17.198 real 0m5.288s 00:10:17.198 user 0m0.097s 00:10:17.198 sys 0m0.047s 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:17.198 ************************************ 00:10:17.198 END TEST bdev_qos_bw 00:10:17.198 ************************************ 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # 
return 0 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:17.198 10:36:52 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:17.198 ************************************ 00:10:17.198 START TEST bdev_qos_ro_bw 00:10:17.198 ************************************ 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:17.198 10:36:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.70 2046.79 0.00 0.00 2060.00 0.00 0.00 ' 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:22.464 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:10:22.465 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:22.465 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:22.465 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw 
-- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:22.465 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:10:22.465 00:10:22.465 real 0m5.186s 00:10:22.465 user 0m0.106s 00:10:22.465 sys 0m0.053s 00:10:22.465 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:22.465 10:36:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:22.465 ************************************ 00:10:22.465 END TEST bdev_qos_ro_bw 00:10:22.465 ************************************ 00:10:22.465 10:36:57 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:22.465 10:36:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:22.465 10:36:57 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:22.465 10:36:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:23.034 10:36:57 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:23.034 10:36:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:23.034 10:36:57 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:23.034 10:36:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:23.034 00:10:23.034 Latency(us) 00:10:23.034 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:23.034 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:23.034 Malloc_0 : 26.82 20843.79 81.42 0.00 0.00 12168.30 2023.07 503316.48 00:10:23.034 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:23.034 Null_1 : 26.97 20501.45 80.08 0.00 0.00 12456.40 804.95 150447.86 00:10:23.034 =================================================================================================================== 00:10:23.034 Total : 41345.24 161.50 0.00 0.00 12311.55 804.95 503316.48 00:10:23.034 0 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 2007076 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 2007076 ']' 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 2007076 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2007076 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2007076' 00:10:23.034 killing process with pid 2007076 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 2007076 00:10:23.034 Received shutdown signal, test time was about 27.029638 seconds 00:10:23.034 00:10:23.034 Latency(us) 00:10:23.034 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:23.034 
=================================================================================================================== 00:10:23.034 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:23.034 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 2007076 00:10:23.294 10:36:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:23.294 00:10:23.294 real 0m28.537s 00:10:23.294 user 0m29.339s 00:10:23.294 sys 0m0.805s 00:10:23.294 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:23.294 10:36:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:23.294 ************************************ 00:10:23.294 END TEST bdev_qos 00:10:23.294 ************************************ 00:10:23.294 10:36:58 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:23.294 10:36:58 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:23.294 10:36:58 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:23.294 10:36:58 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:23.294 10:36:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:23.553 ************************************ 00:10:23.553 START TEST bdev_qd_sampling 00:10:23.553 ************************************ 00:10:23.553 10:36:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:10:23.553 10:36:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:23.553 10:36:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=2010865 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 2010865' 00:10:23.554 Process bdev QD sampling period testing pid: 2010865 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 2010865 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 2010865 ']' 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:23.554 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:23.554 10:36:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:23.554 [2024-07-12 10:36:58.553891] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
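The bdev_qos_ro_bw verdict above follows directly from the configured limit: bdev_set_qos_limit --r_mbytes_per_sec 2 caps reads on Malloc_0 at 2 MB/s, i.e. 2048 KB/s, and run_qos_test accepts any sampled rate inside the bounds 2048*90/100=1843 and 2048*110/100=2252 (roughly +/-10%). The 2060.00 picked out of column 6 of the iostat.py line for Malloc_0 lands inside that window, so the test passes. A minimal standalone sketch of the same check, assuming the SPDK checkout path used by this job and a target that already has the 2 MB/s read limit applied:

#!/usr/bin/env bash
# Hedged sketch of the read-bandwidth QoS check run by run_qos_test above.
# Assumes an SPDK app is serving a bdev named Malloc_0 with a 2 MB/s read limit
# already applied (scripts/rpc.py bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0).
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
qos_limit_kb=$((2 * 1024))             # 2 MB/s expressed in KB/s -> 2048
lower=$((qos_limit_kb * 90 / 100))     # 1843
upper=$((qos_limit_kb * 110 / 100))    # 2252
# Sample five 1-second intervals, keep the last Malloc_0 line, and read the
# same column the test's awk '{print $6}' treats as the observed KB/s.
observed=$("$SPDK"/scripts/iostat.py -d -i 1 -t 5 | grep Malloc_0 | tail -1 | awk '{print int($6)}')
if [ "$observed" -lt "$lower" ] || [ "$observed" -gt "$upper" ]; then
    echo "QoS check failed: ${observed} KB/s outside [${lower}, ${upper}]"
    exit 1
fi
echo "QoS check passed: ${observed} KB/s within [${lower}, ${upper}]"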
00:10:23.554 [2024-07-12 10:36:58.553964] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2010865 ] 00:10:23.554 [2024-07-12 10:36:58.686828] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:23.812 [2024-07-12 10:36:58.790146] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:23.812 [2024-07-12 10:36:58.790152] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:24.380 Malloc_QD 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:24.380 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:24.381 [ 00:10:24.381 { 00:10:24.381 "name": "Malloc_QD", 00:10:24.381 "aliases": [ 00:10:24.381 "f6bb8849-e9d9-4c4d-884b-11b59e5cf08a" 00:10:24.381 ], 00:10:24.381 "product_name": "Malloc disk", 00:10:24.381 "block_size": 512, 00:10:24.381 "num_blocks": 262144, 00:10:24.381 "uuid": "f6bb8849-e9d9-4c4d-884b-11b59e5cf08a", 00:10:24.381 "assigned_rate_limits": { 00:10:24.381 "rw_ios_per_sec": 0, 00:10:24.381 "rw_mbytes_per_sec": 0, 00:10:24.381 "r_mbytes_per_sec": 0, 00:10:24.381 "w_mbytes_per_sec": 0 00:10:24.381 }, 00:10:24.381 "claimed": false, 00:10:24.381 "zoned": false, 00:10:24.381 "supported_io_types": { 00:10:24.381 "read": true, 00:10:24.381 "write": true, 00:10:24.381 "unmap": true, 00:10:24.381 "flush": true, 00:10:24.381 "reset": true, 00:10:24.381 "nvme_admin": false, 00:10:24.381 
"nvme_io": false, 00:10:24.381 "nvme_io_md": false, 00:10:24.381 "write_zeroes": true, 00:10:24.381 "zcopy": true, 00:10:24.381 "get_zone_info": false, 00:10:24.381 "zone_management": false, 00:10:24.381 "zone_append": false, 00:10:24.381 "compare": false, 00:10:24.381 "compare_and_write": false, 00:10:24.381 "abort": true, 00:10:24.381 "seek_hole": false, 00:10:24.381 "seek_data": false, 00:10:24.381 "copy": true, 00:10:24.381 "nvme_iov_md": false 00:10:24.381 }, 00:10:24.381 "memory_domains": [ 00:10:24.381 { 00:10:24.381 "dma_device_id": "system", 00:10:24.381 "dma_device_type": 1 00:10:24.381 }, 00:10:24.381 { 00:10:24.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:24.381 "dma_device_type": 2 00:10:24.381 } 00:10:24.381 ], 00:10:24.381 "driver_specific": {} 00:10:24.381 } 00:10:24.381 ] 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:24.381 10:36:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:24.639 Running I/O for 5 seconds... 00:10:26.544 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:26.544 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:26.545 "tick_rate": 2300000000, 00:10:26.545 "ticks": 4741263549859672, 00:10:26.545 "bdevs": [ 00:10:26.545 { 00:10:26.545 "name": "Malloc_QD", 00:10:26.545 "bytes_read": 770748928, 00:10:26.545 "num_read_ops": 188164, 00:10:26.545 "bytes_written": 0, 00:10:26.545 "num_write_ops": 0, 00:10:26.545 "bytes_unmapped": 0, 00:10:26.545 "num_unmap_ops": 0, 00:10:26.545 "bytes_copied": 0, 00:10:26.545 "num_copy_ops": 0, 00:10:26.545 "read_latency_ticks": 2246933526384, 00:10:26.545 "max_read_latency_ticks": 14739310, 00:10:26.545 "min_read_latency_ticks": 286682, 00:10:26.545 "write_latency_ticks": 0, 00:10:26.545 "max_write_latency_ticks": 0, 00:10:26.545 "min_write_latency_ticks": 0, 00:10:26.545 "unmap_latency_ticks": 0, 00:10:26.545 "max_unmap_latency_ticks": 0, 00:10:26.545 
"min_unmap_latency_ticks": 0, 00:10:26.545 "copy_latency_ticks": 0, 00:10:26.545 "max_copy_latency_ticks": 0, 00:10:26.545 "min_copy_latency_ticks": 0, 00:10:26.545 "io_error": {}, 00:10:26.545 "queue_depth_polling_period": 10, 00:10:26.545 "queue_depth": 512, 00:10:26.545 "io_time": 30, 00:10:26.545 "weighted_io_time": 15360 00:10:26.545 } 00:10:26.545 ] 00:10:26.545 }' 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:26.545 00:10:26.545 Latency(us) 00:10:26.545 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:26.545 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:26.545 Malloc_QD : 1.99 48695.83 190.22 0.00 0.00 5244.32 1474.56 6097.70 00:10:26.545 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:26.545 Malloc_QD : 1.99 49813.12 194.58 0.00 0.00 5127.16 954.55 6411.13 00:10:26.545 =================================================================================================================== 00:10:26.545 Total : 98508.94 384.80 0.00 0.00 5185.05 954.55 6411.13 00:10:26.545 0 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 2010865 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 2010865 ']' 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 2010865 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:26.545 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2010865 00:10:26.805 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:26.805 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:26.805 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2010865' 00:10:26.805 killing process with pid 2010865 00:10:26.805 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 2010865 00:10:26.805 Received shutdown signal, test time was about 2.071601 seconds 00:10:26.805 00:10:26.805 Latency(us) 00:10:26.805 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:26.805 =================================================================================================================== 00:10:26.805 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:26.805 10:37:01 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 2010865 00:10:26.805 10:37:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:26.805 00:10:26.805 real 0m3.459s 00:10:26.805 user 0m6.793s 00:10:26.805 sys 0m0.432s 00:10:26.805 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:26.805 10:37:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:26.805 ************************************ 00:10:26.805 END TEST bdev_qd_sampling 00:10:26.805 ************************************ 00:10:26.805 10:37:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:26.805 10:37:01 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:26.805 10:37:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:26.805 10:37:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:26.805 10:37:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:27.064 ************************************ 00:10:27.064 START TEST bdev_error 00:10:27.064 ************************************ 00:10:27.064 10:37:02 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:10:27.064 10:37:02 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:27.064 10:37:02 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:27.064 10:37:02 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:27.064 10:37:02 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=2011325 00:10:27.064 10:37:02 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 2011325' 00:10:27.064 Process error testing pid: 2011325 00:10:27.064 10:37:02 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:27.064 10:37:02 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 2011325 00:10:27.064 10:37:02 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2011325 ']' 00:10:27.064 10:37:02 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:27.064 10:37:02 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:27.064 10:37:02 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:27.064 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:27.064 10:37:02 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:27.064 10:37:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:27.064 [2024-07-12 10:37:02.098035] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
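The bdev_qd_sampling pass above rests on two RPCs: bdev_set_qd_sampling_period Malloc_QD 10 enables queue-depth sampling with a 10 ms polling period, and bdev_get_iostat -b Malloc_QD then reports queue_depth_polling_period, queue_depth, io_time and weighted_io_time next to the usual counters. The sample is internally consistent: weighted_io_time / io_time = 15360 / 30 = 512, matching the reported queue_depth of 512, which in turn lines up with two bdevperf channels at -q 256 each on core mask 0x3. A hedged sketch of the same sequence using the standard scripts/rpc.py client (an assumption; the test drives these RPCs through its rpc_cmd wrapper):

# Assumes an SPDK app is already listening on the default RPC socket.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK"/scripts/rpc.py bdev_malloc_create -b Malloc_QD 128 512      # 128 MB bdev, 512 B blocks
"$SPDK"/scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10     # sample queue depth every 10 ms
sleep 2                                                             # let some I/O and samples accumulate
"$SPDK"/scripts/rpc.py bdev_get_iostat -b Malloc_QD \
    | jq '.bdevs[0] | {queue_depth_polling_period, queue_depth, io_time, weighted_io_time}'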
00:10:27.064 [2024-07-12 10:37:02.098102] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2011325 ] 00:10:27.064 [2024-07-12 10:37:02.217558] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:27.323 [2024-07-12 10:37:02.314977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:27.890 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:27.890 Dev_1 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:27.890 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:27.890 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:28.149 [ 00:10:28.149 { 00:10:28.149 "name": "Dev_1", 00:10:28.149 "aliases": [ 00:10:28.149 "0b5bc4e6-8bf0-49f4-ab04-5019b494275d" 00:10:28.149 ], 00:10:28.149 "product_name": "Malloc disk", 00:10:28.149 "block_size": 512, 00:10:28.149 "num_blocks": 262144, 00:10:28.149 "uuid": "0b5bc4e6-8bf0-49f4-ab04-5019b494275d", 00:10:28.149 "assigned_rate_limits": { 00:10:28.149 "rw_ios_per_sec": 0, 00:10:28.149 "rw_mbytes_per_sec": 0, 00:10:28.149 "r_mbytes_per_sec": 0, 00:10:28.149 "w_mbytes_per_sec": 0 00:10:28.149 }, 00:10:28.149 "claimed": false, 00:10:28.149 "zoned": false, 00:10:28.149 "supported_io_types": { 00:10:28.149 "read": true, 00:10:28.149 "write": true, 00:10:28.149 "unmap": true, 00:10:28.149 "flush": true, 00:10:28.149 "reset": true, 00:10:28.149 "nvme_admin": false, 00:10:28.149 "nvme_io": false, 00:10:28.149 "nvme_io_md": false, 00:10:28.149 "write_zeroes": true, 00:10:28.149 "zcopy": true, 00:10:28.149 "get_zone_info": false, 00:10:28.149 "zone_management": false, 00:10:28.149 "zone_append": false, 00:10:28.149 
"compare": false, 00:10:28.149 "compare_and_write": false, 00:10:28.149 "abort": true, 00:10:28.149 "seek_hole": false, 00:10:28.149 "seek_data": false, 00:10:28.149 "copy": true, 00:10:28.149 "nvme_iov_md": false 00:10:28.149 }, 00:10:28.149 "memory_domains": [ 00:10:28.149 { 00:10:28.149 "dma_device_id": "system", 00:10:28.149 "dma_device_type": 1 00:10:28.149 }, 00:10:28.149 { 00:10:28.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:28.149 "dma_device_type": 2 00:10:28.149 } 00:10:28.149 ], 00:10:28.149 "driver_specific": {} 00:10:28.149 } 00:10:28.149 ] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:28.149 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:28.149 true 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:28.149 Dev_2 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:28.149 [ 00:10:28.149 { 00:10:28.149 "name": "Dev_2", 00:10:28.149 "aliases": [ 00:10:28.149 "ad3725b5-59fb-4668-b3c1-9fdb66f27230" 00:10:28.149 ], 00:10:28.149 "product_name": "Malloc disk", 00:10:28.149 "block_size": 512, 00:10:28.149 "num_blocks": 262144, 00:10:28.149 "uuid": "ad3725b5-59fb-4668-b3c1-9fdb66f27230", 00:10:28.149 "assigned_rate_limits": { 00:10:28.149 "rw_ios_per_sec": 0, 00:10:28.149 "rw_mbytes_per_sec": 0, 00:10:28.149 "r_mbytes_per_sec": 0, 00:10:28.149 "w_mbytes_per_sec": 0 00:10:28.149 }, 00:10:28.149 "claimed": false, 
00:10:28.149 "zoned": false, 00:10:28.149 "supported_io_types": { 00:10:28.149 "read": true, 00:10:28.149 "write": true, 00:10:28.149 "unmap": true, 00:10:28.149 "flush": true, 00:10:28.149 "reset": true, 00:10:28.149 "nvme_admin": false, 00:10:28.149 "nvme_io": false, 00:10:28.149 "nvme_io_md": false, 00:10:28.149 "write_zeroes": true, 00:10:28.149 "zcopy": true, 00:10:28.149 "get_zone_info": false, 00:10:28.149 "zone_management": false, 00:10:28.149 "zone_append": false, 00:10:28.149 "compare": false, 00:10:28.149 "compare_and_write": false, 00:10:28.149 "abort": true, 00:10:28.149 "seek_hole": false, 00:10:28.149 "seek_data": false, 00:10:28.149 "copy": true, 00:10:28.149 "nvme_iov_md": false 00:10:28.149 }, 00:10:28.149 "memory_domains": [ 00:10:28.149 { 00:10:28.149 "dma_device_id": "system", 00:10:28.149 "dma_device_type": 1 00:10:28.149 }, 00:10:28.149 { 00:10:28.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:28.149 "dma_device_type": 2 00:10:28.149 } 00:10:28.149 ], 00:10:28.149 "driver_specific": {} 00:10:28.149 } 00:10:28.149 ] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:28.149 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:28.149 10:37:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.149 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:28.149 10:37:03 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:28.149 Running I/O for 5 seconds... 00:10:29.080 10:37:04 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 2011325 00:10:29.080 10:37:04 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 2011325' 00:10:29.080 Process is existed as continue on error is set. 
Pid: 2011325 00:10:29.080 10:37:04 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:29.080 10:37:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.080 10:37:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.080 10:37:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.080 10:37:04 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:29.080 10:37:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.080 10:37:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.080 10:37:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.080 10:37:04 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:29.338 Timeout while waiting for response: 00:10:29.338 00:10:29.338 00:10:33.523 00:10:33.523 Latency(us) 00:10:33.523 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:33.523 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:33.523 EE_Dev_1 : 0.90 37336.45 145.85 5.56 0.00 424.91 134.46 698.10 00:10:33.523 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:33.523 Dev_2 : 5.00 80857.22 315.85 0.00 0.00 194.36 68.56 22567.18 00:10:33.523 =================================================================================================================== 00:10:33.523 Total : 118193.67 461.69 5.56 0.00 212.04 68.56 22567.18 00:10:34.114 10:37:09 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 2011325 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 2011325 ']' 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 2011325 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2011325 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2011325' 00:10:34.114 killing process with pid 2011325 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 2011325 00:10:34.114 Received shutdown signal, test time was about 5.000000 seconds 00:10:34.114 00:10:34.114 Latency(us) 00:10:34.114 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:34.114 =================================================================================================================== 00:10:34.114 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:34.114 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 2011325 00:10:34.372 10:37:09 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=2012313 00:10:34.372 10:37:09 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 2012313' 00:10:34.372 Process error testing pid: 2012313 00:10:34.372 10:37:09 
blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:34.372 10:37:09 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 2012313 00:10:34.372 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2012313 ']' 00:10:34.372 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:34.372 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:34.372 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:34.372 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:34.372 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:34.629 10:37:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:34.629 [2024-07-12 10:37:09.624805] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:10:34.629 [2024-07-12 10:37:09.624872] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2012313 ] 00:10:34.629 [2024-07-12 10:37:09.743769] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:34.886 [2024-07-12 10:37:09.849529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:35.451 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.451 Dev_1 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.451 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:35.451 10:37:10 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.451 [ 00:10:35.451 { 00:10:35.451 "name": "Dev_1", 00:10:35.451 "aliases": [ 00:10:35.451 "e6b151da-9977-4b13-963a-65c166dd3c7f" 00:10:35.451 ], 00:10:35.451 "product_name": "Malloc disk", 00:10:35.451 "block_size": 512, 00:10:35.451 "num_blocks": 262144, 00:10:35.451 "uuid": "e6b151da-9977-4b13-963a-65c166dd3c7f", 00:10:35.451 "assigned_rate_limits": { 00:10:35.451 "rw_ios_per_sec": 0, 00:10:35.451 "rw_mbytes_per_sec": 0, 00:10:35.451 "r_mbytes_per_sec": 0, 00:10:35.451 "w_mbytes_per_sec": 0 00:10:35.451 }, 00:10:35.451 "claimed": false, 00:10:35.451 "zoned": false, 00:10:35.451 "supported_io_types": { 00:10:35.451 "read": true, 00:10:35.451 "write": true, 00:10:35.451 "unmap": true, 00:10:35.451 "flush": true, 00:10:35.451 "reset": true, 00:10:35.451 "nvme_admin": false, 00:10:35.451 "nvme_io": false, 00:10:35.451 "nvme_io_md": false, 00:10:35.451 "write_zeroes": true, 00:10:35.451 "zcopy": true, 00:10:35.451 "get_zone_info": false, 00:10:35.451 "zone_management": false, 00:10:35.451 "zone_append": false, 00:10:35.451 "compare": false, 00:10:35.451 "compare_and_write": false, 00:10:35.451 "abort": true, 00:10:35.451 "seek_hole": false, 00:10:35.451 "seek_data": false, 00:10:35.451 "copy": true, 00:10:35.451 "nvme_iov_md": false 00:10:35.451 }, 00:10:35.451 "memory_domains": [ 00:10:35.451 { 00:10:35.451 "dma_device_id": "system", 00:10:35.451 "dma_device_type": 1 00:10:35.451 }, 00:10:35.451 { 00:10:35.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.451 "dma_device_type": 2 00:10:35.451 } 00:10:35.451 ], 00:10:35.451 "driver_specific": {} 00:10:35.451 } 00:10:35.451 ] 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:35.451 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.451 true 00:10:35.451 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.451 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:35.452 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.452 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.709 Dev_2 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.709 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.709 [ 00:10:35.709 { 00:10:35.709 "name": "Dev_2", 00:10:35.709 "aliases": [ 00:10:35.709 "0221c7dd-9933-49eb-89b5-9bbfb13a772d" 00:10:35.709 ], 00:10:35.709 "product_name": "Malloc disk", 00:10:35.709 "block_size": 512, 00:10:35.709 "num_blocks": 262144, 00:10:35.709 "uuid": "0221c7dd-9933-49eb-89b5-9bbfb13a772d", 00:10:35.709 "assigned_rate_limits": { 00:10:35.709 "rw_ios_per_sec": 0, 00:10:35.709 "rw_mbytes_per_sec": 0, 00:10:35.709 "r_mbytes_per_sec": 0, 00:10:35.709 "w_mbytes_per_sec": 0 00:10:35.709 }, 00:10:35.709 "claimed": false, 00:10:35.709 "zoned": false, 00:10:35.709 "supported_io_types": { 00:10:35.709 "read": true, 00:10:35.709 "write": true, 00:10:35.709 "unmap": true, 00:10:35.709 "flush": true, 00:10:35.709 "reset": true, 00:10:35.709 "nvme_admin": false, 00:10:35.709 "nvme_io": false, 00:10:35.709 "nvme_io_md": false, 00:10:35.709 "write_zeroes": true, 00:10:35.709 "zcopy": true, 00:10:35.709 "get_zone_info": false, 00:10:35.709 "zone_management": false, 00:10:35.709 "zone_append": false, 00:10:35.709 "compare": false, 00:10:35.709 "compare_and_write": false, 00:10:35.709 "abort": true, 00:10:35.709 "seek_hole": false, 00:10:35.709 "seek_data": false, 00:10:35.709 "copy": true, 00:10:35.709 "nvme_iov_md": false 00:10:35.709 }, 00:10:35.709 "memory_domains": [ 00:10:35.709 { 00:10:35.709 "dma_device_id": "system", 00:10:35.709 "dma_device_type": 1 00:10:35.709 }, 00:10:35.709 { 00:10:35.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.709 "dma_device_type": 2 00:10:35.709 } 00:10:35.709 ], 00:10:35.709 "driver_specific": {} 00:10:35.709 } 00:10:35.709 ] 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:35.709 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:35.709 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 2012313 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 2012313 00:10:35.709 10:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:35.709 10:37:10 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:35.709 10:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 2012313 00:10:35.709 Running I/O for 5 seconds... 00:10:35.709 task offset: 13536 on job bdev=EE_Dev_1 fails 00:10:35.709 00:10:35.709 Latency(us) 00:10:35.709 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:35.709 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:35.709 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:35.709 EE_Dev_1 : 0.00 29931.97 116.92 6802.72 0.00 362.16 138.02 644.67 00:10:35.709 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:35.709 Dev_2 : 0.00 18359.15 71.72 0.00 0.00 647.68 127.33 1203.87 00:10:35.709 =================================================================================================================== 00:10:35.709 Total : 48291.12 188.64 6802.72 0.00 517.02 127.33 1203.87 00:10:35.709 [2024-07-12 10:37:10.839207] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:35.709 request: 00:10:35.709 { 00:10:35.709 "method": "perform_tests", 00:10:35.709 "req_id": 1 00:10:35.709 } 00:10:35.709 Got JSON-RPC error response 00:10:35.709 response: 00:10:35.709 { 00:10:35.709 "code": -32603, 00:10:35.709 "message": "bdevperf failed with error Operation not permitted" 00:10:35.709 } 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:35.968 00:10:35.968 real 0m9.099s 00:10:35.968 user 0m9.493s 00:10:35.968 sys 0m0.906s 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:35.968 10:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:35.968 ************************************ 00:10:35.968 END TEST bdev_error 00:10:35.968 ************************************ 00:10:36.225 10:37:11 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:36.225 10:37:11 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:36.225 10:37:11 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:36.225 10:37:11 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:36.225 10:37:11 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:36.225 ************************************ 00:10:36.225 START TEST bdev_stat 00:10:36.225 ************************************ 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=2012514 00:10:36.225 
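The two bdev_error passes above exercise the same stack in opposite modes. bdev_error_create Dev_1 layers an error-injecting bdev named EE_Dev_1 on top of the malloc bdev Dev_1, and bdev_error_inject_error EE_Dev_1 all failure -n 5 arms it to fail the next five I/Os (all I/O types, error type 'failure'). The first bdevperf instance was started with -f, so it continues on error and EE_Dev_1 merely shows a non-zero Fail/s column (5.56 above); the second instance omits -f, aborts on the injected failures, and perform_tests comes back with the JSON-RPC error -32603 seen above, which is exactly the failure the NOT wrapper expects. A hedged sketch of the injection sequence via scripts/rpc.py (the rpc.py entry point is an assumption; the RPC names and arguments mirror the log):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK"/scripts/rpc.py bdev_malloc_create -b Dev_1 128 512                 # base malloc bdev
"$SPDK"/scripts/rpc.py bdev_error_create Dev_1                             # stacks EE_Dev_1 on Dev_1
"$SPDK"/scripts/rpc.py bdev_error_inject_error EE_Dev_1 all failure -n 5   # next 5 I/Os will fail
# ... run the workload against EE_Dev_1, then tear down:
"$SPDK"/scripts/rpc.py bdev_error_delete EE_Dev_1
"$SPDK"/scripts/rpc.py bdev_malloc_delete Dev_1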
10:37:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 2012514' 00:10:36.225 Process Bdev IO statistics testing pid: 2012514 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 2012514 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 2012514 ']' 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:36.225 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:36.225 10:37:11 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:36.225 [2024-07-12 10:37:11.289195] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:10:36.226 [2024-07-12 10:37:11.289272] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2012514 ] 00:10:36.483 [2024-07-12 10:37:11.423301] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:36.483 [2024-07-12 10:37:11.531164] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:36.483 [2024-07-12 10:37:11.531169] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:37.047 Malloc_STAT 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:37.047 [ 00:10:37.047 { 00:10:37.047 "name": "Malloc_STAT", 00:10:37.047 "aliases": [ 00:10:37.047 "39f0937f-babb-4f44-89b0-dcf317dcc013" 00:10:37.047 ], 00:10:37.047 "product_name": "Malloc disk", 00:10:37.047 "block_size": 512, 00:10:37.047 "num_blocks": 262144, 00:10:37.047 "uuid": "39f0937f-babb-4f44-89b0-dcf317dcc013", 00:10:37.047 "assigned_rate_limits": { 00:10:37.047 "rw_ios_per_sec": 0, 00:10:37.047 "rw_mbytes_per_sec": 0, 00:10:37.047 "r_mbytes_per_sec": 0, 00:10:37.047 "w_mbytes_per_sec": 0 00:10:37.047 }, 00:10:37.047 "claimed": false, 00:10:37.047 "zoned": false, 00:10:37.047 "supported_io_types": { 00:10:37.047 "read": true, 00:10:37.047 "write": true, 00:10:37.047 "unmap": true, 00:10:37.047 "flush": true, 00:10:37.047 "reset": true, 00:10:37.047 "nvme_admin": false, 00:10:37.047 "nvme_io": false, 00:10:37.047 "nvme_io_md": false, 00:10:37.047 "write_zeroes": true, 00:10:37.047 "zcopy": true, 00:10:37.047 "get_zone_info": false, 00:10:37.047 "zone_management": false, 00:10:37.047 "zone_append": false, 00:10:37.047 "compare": false, 00:10:37.047 "compare_and_write": false, 00:10:37.047 "abort": true, 00:10:37.047 "seek_hole": false, 00:10:37.047 "seek_data": false, 00:10:37.047 "copy": true, 00:10:37.047 "nvme_iov_md": false 00:10:37.047 }, 00:10:37.047 "memory_domains": [ 00:10:37.047 { 00:10:37.047 "dma_device_id": "system", 00:10:37.047 "dma_device_type": 1 00:10:37.047 }, 00:10:37.047 { 00:10:37.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.047 "dma_device_type": 2 00:10:37.047 } 00:10:37.047 ], 00:10:37.047 "driver_specific": {} 00:10:37.047 } 00:10:37.047 ] 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:37.047 10:37:12 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:37.305 Running I/O for 10 seconds... 
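bdev_stat, like the suites before it, uses the same driver pattern: bdevperf is launched with -z so it starts idle and only serves RPC, the target bdev (Malloc_STAT, 128 MB with 512-byte blocks) is created over RPC, and examples/bdev/bdevperf/bdevperf.py perform_tests then kicks off the run, the "Running I/O for 10 seconds..." line above being bdevperf's acknowledgement. A rough sketch of that pattern, with the workload flags copied from the invocation above and the startup handling simplified (the real test uses its waitforlisten and killprocess helpers rather than sleep and kill):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK"/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' &
BDEVPERF_PID=$!
sleep 2                                                        # crude wait for the RPC server to come up
"$SPDK"/scripts/rpc.py bdev_malloc_create -b Malloc_STAT 128 512
"$SPDK"/examples/bdev/bdevperf/bdevperf.py perform_tests       # triggers the 10 s randread run
kill "$BDEVPERF_PID"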
00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:39.202 "tick_rate": 2300000000, 00:10:39.202 "ticks": 4741292587290340, 00:10:39.202 "bdevs": [ 00:10:39.202 { 00:10:39.202 "name": "Malloc_STAT", 00:10:39.202 "bytes_read": 761311744, 00:10:39.202 "num_read_ops": 185860, 00:10:39.202 "bytes_written": 0, 00:10:39.202 "num_write_ops": 0, 00:10:39.202 "bytes_unmapped": 0, 00:10:39.202 "num_unmap_ops": 0, 00:10:39.202 "bytes_copied": 0, 00:10:39.202 "num_copy_ops": 0, 00:10:39.202 "read_latency_ticks": 2227378837118, 00:10:39.202 "max_read_latency_ticks": 15146286, 00:10:39.202 "min_read_latency_ticks": 279066, 00:10:39.202 "write_latency_ticks": 0, 00:10:39.202 "max_write_latency_ticks": 0, 00:10:39.202 "min_write_latency_ticks": 0, 00:10:39.202 "unmap_latency_ticks": 0, 00:10:39.202 "max_unmap_latency_ticks": 0, 00:10:39.202 "min_unmap_latency_ticks": 0, 00:10:39.202 "copy_latency_ticks": 0, 00:10:39.202 "max_copy_latency_ticks": 0, 00:10:39.202 "min_copy_latency_ticks": 0, 00:10:39.202 "io_error": {} 00:10:39.202 } 00:10:39.202 ] 00:10:39.202 }' 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=185860 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:39.202 "tick_rate": 2300000000, 00:10:39.202 "ticks": 4741292720786892, 00:10:39.202 "name": "Malloc_STAT", 00:10:39.202 "channels": [ 00:10:39.202 { 00:10:39.202 "thread_id": 2, 00:10:39.202 "bytes_read": 386924544, 00:10:39.202 "num_read_ops": 94464, 00:10:39.202 "bytes_written": 0, 00:10:39.202 "num_write_ops": 0, 00:10:39.202 "bytes_unmapped": 0, 00:10:39.202 "num_unmap_ops": 0, 
00:10:39.202 "bytes_copied": 0, 00:10:39.202 "num_copy_ops": 0, 00:10:39.202 "read_latency_ticks": 1147429415096, 00:10:39.202 "max_read_latency_ticks": 12984984, 00:10:39.202 "min_read_latency_ticks": 7925910, 00:10:39.202 "write_latency_ticks": 0, 00:10:39.202 "max_write_latency_ticks": 0, 00:10:39.202 "min_write_latency_ticks": 0, 00:10:39.202 "unmap_latency_ticks": 0, 00:10:39.202 "max_unmap_latency_ticks": 0, 00:10:39.202 "min_unmap_latency_ticks": 0, 00:10:39.202 "copy_latency_ticks": 0, 00:10:39.202 "max_copy_latency_ticks": 0, 00:10:39.202 "min_copy_latency_ticks": 0 00:10:39.202 }, 00:10:39.202 { 00:10:39.202 "thread_id": 3, 00:10:39.202 "bytes_read": 398458880, 00:10:39.202 "num_read_ops": 97280, 00:10:39.202 "bytes_written": 0, 00:10:39.202 "num_write_ops": 0, 00:10:39.202 "bytes_unmapped": 0, 00:10:39.202 "num_unmap_ops": 0, 00:10:39.202 "bytes_copied": 0, 00:10:39.202 "num_copy_ops": 0, 00:10:39.202 "read_latency_ticks": 1150399011536, 00:10:39.202 "max_read_latency_ticks": 15146286, 00:10:39.202 "min_read_latency_ticks": 7867736, 00:10:39.202 "write_latency_ticks": 0, 00:10:39.202 "max_write_latency_ticks": 0, 00:10:39.202 "min_write_latency_ticks": 0, 00:10:39.202 "unmap_latency_ticks": 0, 00:10:39.202 "max_unmap_latency_ticks": 0, 00:10:39.202 "min_unmap_latency_ticks": 0, 00:10:39.202 "copy_latency_ticks": 0, 00:10:39.202 "max_copy_latency_ticks": 0, 00:10:39.202 "min_copy_latency_ticks": 0 00:10:39.202 } 00:10:39.202 ] 00:10:39.202 }' 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=94464 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=94464 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=97280 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=191744 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:39.202 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:39.460 "tick_rate": 2300000000, 00:10:39.460 "ticks": 4741292951733646, 00:10:39.460 "bdevs": [ 00:10:39.460 { 00:10:39.460 "name": "Malloc_STAT", 00:10:39.460 "bytes_read": 825274880, 00:10:39.460 "num_read_ops": 201476, 00:10:39.460 "bytes_written": 0, 00:10:39.460 "num_write_ops": 0, 00:10:39.460 "bytes_unmapped": 0, 00:10:39.460 "num_unmap_ops": 0, 00:10:39.460 "bytes_copied": 0, 00:10:39.460 "num_copy_ops": 0, 00:10:39.460 "read_latency_ticks": 2414607508550, 00:10:39.460 "max_read_latency_ticks": 15146286, 00:10:39.460 "min_read_latency_ticks": 279066, 00:10:39.460 "write_latency_ticks": 0, 00:10:39.460 "max_write_latency_ticks": 0, 00:10:39.460 "min_write_latency_ticks": 0, 00:10:39.460 "unmap_latency_ticks": 0, 00:10:39.460 "max_unmap_latency_ticks": 0, 00:10:39.460 "min_unmap_latency_ticks": 0, 00:10:39.460 "copy_latency_ticks": 0, 00:10:39.460 "max_copy_latency_ticks": 0, 00:10:39.460 
"min_copy_latency_ticks": 0, 00:10:39.460 "io_error": {} 00:10:39.460 } 00:10:39.460 ] 00:10:39.460 }' 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=201476 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 191744 -lt 185860 ']' 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 191744 -gt 201476 ']' 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:39.460 00:10:39.460 Latency(us) 00:10:39.460 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:39.460 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:39.460 Malloc_STAT : 2.13 48387.36 189.01 0.00 0.00 5277.48 1738.13 5670.29 00:10:39.460 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:39.460 Malloc_STAT : 2.13 49698.92 194.14 0.00 0.00 5138.93 1474.56 6610.59 00:10:39.460 =================================================================================================================== 00:10:39.460 Total : 98086.29 383.15 0.00 0.00 5207.27 1474.56 6610.59 00:10:39.460 0 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 2012514 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 2012514 ']' 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 2012514 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2012514 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2012514' 00:10:39.460 killing process with pid 2012514 00:10:39.460 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 2012514 00:10:39.460 Received shutdown signal, test time was about 2.211554 seconds 00:10:39.460 00:10:39.460 Latency(us) 00:10:39.460 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:39.460 =================================================================================================================== 00:10:39.461 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:39.461 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 2012514 00:10:39.719 10:37:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:39.719 00:10:39.719 real 0m3.538s 00:10:39.719 user 0m6.952s 00:10:39.719 sys 0m0.486s 00:10:39.719 10:37:14 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:39.719 10:37:14 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:39.719 ************************************ 00:10:39.719 END TEST bdev_stat 00:10:39.719 ************************************ 00:10:39.719 10:37:14 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:39.719 10:37:14 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:39.719 00:10:39.719 real 1m57.315s 00:10:39.719 user 7m12.060s 00:10:39.719 sys 0m23.010s 00:10:39.719 10:37:14 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:39.719 10:37:14 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:39.719 ************************************ 00:10:39.719 END TEST blockdev_general 00:10:39.719 ************************************ 00:10:39.719 10:37:14 -- common/autotest_common.sh@1142 -- # return 0 00:10:39.719 10:37:14 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:39.719 10:37:14 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:39.719 10:37:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:39.719 10:37:14 -- common/autotest_common.sh@10 -- # set +x 00:10:39.719 ************************************ 00:10:39.719 START TEST bdev_raid 00:10:39.719 ************************************ 00:10:39.719 10:37:14 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:39.978 * Looking for test storage... 
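The bdev_stat sequence above is a consistency check on per-channel accounting: it snapshots the bdev-wide read counter, sums num_read_ops over the two channels (94464 + 97280 = 191744), snapshots the total again, and fails if the per-channel sum falls outside the two snapshots (185860 and 201476). A minimal sketch of the same check, assuming an SPDK target is already running with a bdev named Malloc_STAT and that rpc.py can reach it on its default socket (in the log the rpc_cmd wrapper handles socket selection):

    rpc=./scripts/rpc.py
    bdev=Malloc_STAT
    # bdev-wide read count before sampling the channels
    io_count1=$($rpc bdev_get_iostat -b "$bdev" | jq -r '.bdevs[0].num_read_ops')
    # -c reports per-channel stats; sum the read counts across all channels
    per_channel_all=$($rpc bdev_get_iostat -b "$bdev" -c | jq '[.channels[].num_read_ops] | add')
    # bdev-wide read count after sampling the channels
    io_count2=$($rpc bdev_get_iostat -b "$bdev" | jq -r '.bdevs[0].num_read_ops')
    # with I/O still running, the per-channel sum must land between the two snapshots
    [ "$per_channel_all" -ge "$io_count1" ] && [ "$per_channel_all" -le "$io_count2" ]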
00:10:39.978 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:39.978 10:37:15 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:39.978 10:37:15 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:39.978 10:37:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:39.978 10:37:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:39.978 10:37:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:39.978 ************************************ 00:10:39.978 START TEST raid_function_test_raid0 00:10:39.978 ************************************ 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2013128 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2013128' 00:10:39.978 Process raid pid: 2013128 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2013128 /var/tmp/spdk-raid.sock 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 2013128 ']' 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:39.978 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
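Each raid test in this block starts a private bdev_svc instance bound to /var/tmp/spdk-raid.sock and configures it entirely through rpc.py -s. The rpcs.txt batch that is piped into rpc.py just below is not printed verbatim in the log, so the create commands in this sketch are an assumption, reconstructed from the claimed base bdevs (Base_1, Base_2) and the resulting 131072-block, 512-byte raid0 volume; the query at the end is taken directly from the trace:

    sock=/var/tmp/spdk-raid.sock
    rpc="./scripts/rpc.py -s $sock"
    # two 32 MiB, 512-byte-block malloc bdevs (sizes inferred, names from the log)
    $rpc bdev_malloc_create 32 512 -b Base_1
    $rpc bdev_malloc_create 32 512 -b Base_2
    # assemble them into a raid0 bdev named "raid" (64 KiB strip assumed)
    $rpc bdev_raid_create -n raid -r raid0 -z 64 -b 'Base_1 Base_2'
    # confirm the array is online and pick up its name
    $rpc bdev_raid_get_bdevs online | jq -r '.[0]["name"] | select(.)'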
00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:39.978 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:39.978 [2024-07-12 10:37:15.139728] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:10:39.978 [2024-07-12 10:37:15.139793] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:40.237 [2024-07-12 10:37:15.269516] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:40.237 [2024-07-12 10:37:15.372509] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:40.495 [2024-07-12 10:37:15.437899] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:40.495 [2024-07-12 10:37:15.437931] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.061 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:41.061 10:37:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:41.061 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:41.061 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:41.061 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:41.061 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:41.061 10:37:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:41.061 [2024-07-12 10:37:16.239896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:41.061 [2024-07-12 10:37:16.241346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:41.061 [2024-07-12 10:37:16.241403] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22bdbd0 00:10:41.061 [2024-07-12 10:37:16.241413] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:41.061 [2024-07-12 10:37:16.241622] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22bdb10 00:10:41.062 [2024-07-12 10:37:16.241741] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22bdbd0 00:10:41.062 [2024-07-12 10:37:16.241751] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x22bdbd0 00:10:41.062 [2024-07-12 10:37:16.241850] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:41.062 Base_1 00:10:41.062 Base_2 00:10:41.320 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:41.320 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:41.320 10:37:16 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:41.579 [2024-07-12 10:37:16.737238] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24718e0 00:10:41.579 /dev/nbd0 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:41.579 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:41.837 1+0 records in 00:10:41.837 1+0 records out 00:10:41.837 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250705 s, 16.3 MB/s 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:41.837 10:37:16 
bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:41.837 10:37:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:42.095 { 00:10:42.095 "nbd_device": "/dev/nbd0", 00:10:42.095 "bdev_name": "raid" 00:10:42.095 } 00:10:42.095 ]' 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:42.095 { 00:10:42.095 "nbd_device": "/dev/nbd0", 00:10:42.095 "bdev_name": "raid" 00:10:42.095 } 00:10:42.095 ]' 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:42.095 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:42.096 10:37:17 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:42.096 4096+0 records in 00:10:42.096 4096+0 records out 00:10:42.096 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0279297 s, 75.1 MB/s 00:10:42.096 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:42.354 4096+0 records in 00:10:42.354 4096+0 records out 00:10:42.354 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.20484 s, 10.2 MB/s 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:42.354 128+0 records in 00:10:42.354 128+0 records out 00:10:42.354 65536 bytes (66 kB, 64 KiB) copied, 0.000370126 s, 177 MB/s 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:42.354 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:42.354 2035+0 records in 00:10:42.355 2035+0 records out 00:10:42.355 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0114031 s, 91.4 MB/s 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:42.355 10:37:17 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:42.355 456+0 records in 00:10:42.355 456+0 records out 00:10:42.355 233472 bytes (233 kB, 228 KiB) copied, 0.00272931 s, 85.5 MB/s 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:42.355 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:42.613 [2024-07-12 10:37:17.713258] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:42.613 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:42.872 10:37:17 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:42.872 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:42.872 10:37:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2013128 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 2013128 ']' 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 2013128 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:42.872 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2013128 00:10:43.130 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:43.130 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:43.130 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2013128' 00:10:43.130 killing process with pid 2013128 00:10:43.130 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 2013128 00:10:43.130 [2024-07-12 10:37:18.075725] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:43.130 [2024-07-12 10:37:18.075789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:43.130 [2024-07-12 10:37:18.075830] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:43.131 [2024-07-12 10:37:18.075841] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22bdbd0 name raid, state offline 00:10:43.131 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 2013128 00:10:43.131 [2024-07-12 10:37:18.094040] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:43.131 10:37:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:43.131 00:10:43.131 real 0m3.236s 00:10:43.131 user 0m4.285s 00:10:43.131 sys 0m1.210s 00:10:43.131 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:43.131 10:37:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:43.131 ************************************ 00:10:43.131 END TEST raid_function_test_raid0 00:10:43.131 
************************************ 00:10:43.389 10:37:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:43.389 10:37:18 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:43.389 10:37:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:43.389 10:37:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:43.389 10:37:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:43.389 ************************************ 00:10:43.389 START TEST raid_function_test_concat 00:10:43.389 ************************************ 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2013719 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2013719' 00:10:43.389 Process raid pid: 2013719 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2013719 /var/tmp/spdk-raid.sock 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 2013719 ']' 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:43.389 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:43.389 10:37:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:43.389 [2024-07-12 10:37:18.467106] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:10:43.389 [2024-07-12 10:37:18.467179] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:43.648 [2024-07-12 10:37:18.598621] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:43.648 [2024-07-12 10:37:18.699550] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:43.648 [2024-07-12 10:37:18.759593] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:43.648 [2024-07-12 10:37:18.759621] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:44.215 10:37:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:44.215 10:37:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:44.215 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:44.215 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:44.215 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:44.215 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:44.215 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:44.474 [2024-07-12 10:37:19.664666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:44.474 [2024-07-12 10:37:19.666139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:44.474 [2024-07-12 10:37:19.666199] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x133abd0 00:10:44.474 [2024-07-12 10:37:19.666210] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:44.474 [2024-07-12 10:37:19.666395] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x133ab10 00:10:44.474 [2024-07-12 10:37:19.666533] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x133abd0 00:10:44.474 [2024-07-12 10:37:19.666543] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x133abd0 00:10:44.474 [2024-07-12 10:37:19.666645] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:44.733 Base_1 00:10:44.733 Base_2 00:10:44.733 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:44.733 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:44.733 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:44.991 10:37:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:44.991 [2024-07-12 10:37:20.097846] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14ee8e0 00:10:44.991 /dev/nbd0 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:44.991 1+0 records in 00:10:44.991 1+0 records out 00:10:44.991 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233524 s, 17.5 MB/s 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:44.991 
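As in the raid0 variant earlier, the concat raid bdev is handed to the kernel through NBD so that ordinary block utilities can exercise it; the waitfornbd helper seen above just polls /proc/partitions and proves the device is readable with one direct 4 KiB read. A rough equivalent of that step, assuming the same socket and raid bdev name, with a hypothetical scratch path for the read check:

    sock=/var/tmp/spdk-raid.sock
    ./scripts/rpc.py -s "$sock" nbd_start_disk raid /dev/nbd0
    # wait (up to ~2 s here) for the kernel to publish the partition entry
    for i in $(seq 1 20); do
        grep -q -w nbd0 /proc/partitions && break
        sleep 0.1
    done
    # one direct read is enough to prove the NBD<->bdev path works
    dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct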
10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:44.991 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:45.249 { 00:10:45.249 "nbd_device": "/dev/nbd0", 00:10:45.249 "bdev_name": "raid" 00:10:45.249 } 00:10:45.249 ]' 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:45.249 { 00:10:45.249 "nbd_device": "/dev/nbd0", 00:10:45.249 "bdev_name": "raid" 00:10:45.249 } 00:10:45.249 ]' 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:45.249 4096+0 records in 00:10:45.249 4096+0 records out 00:10:45.249 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0288915 s, 72.6 MB/s 00:10:45.249 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:45.506 4096+0 records in 00:10:45.506 4096+0 records out 00:10:45.506 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.209165 s, 10.0 MB/s 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:45.506 128+0 records in 00:10:45.506 128+0 records out 00:10:45.506 65536 bytes (66 kB, 64 KiB) copied, 0.000828144 s, 79.1 MB/s 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:45.506 2035+0 records in 00:10:45.506 2035+0 records out 00:10:45.506 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0106831 s, 97.5 MB/s 00:10:45.506 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:45.764 456+0 records in 00:10:45.764 456+0 
records out 00:10:45.764 233472 bytes (233 kB, 228 KiB) copied, 0.00266604 s, 87.6 MB/s 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:45.764 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:46.023 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:46.023 [2024-07-12 10:37:20.999264] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:46.023 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:46.023 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:46.023 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:46.023 10:37:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:46.023 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:46.023 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:46.023 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:46.023 10:37:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:46.023 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:46.023 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:46.282 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2013719 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 2013719 ']' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 2013719 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2013719 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2013719' 00:10:46.283 killing process with pid 2013719 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 2013719 00:10:46.283 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 2013719 00:10:46.283 [2024-07-12 10:37:21.346716] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:46.283 [2024-07-12 10:37:21.346781] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:46.283 [2024-07-12 10:37:21.346830] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:46.283 [2024-07-12 10:37:21.346842] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x133abd0 name raid, state offline 00:10:46.283 [2024-07-12 10:37:21.377174] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:46.851 10:37:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:46.851 00:10:46.851 real 0m3.354s 00:10:46.851 user 0m4.347s 00:10:46.851 sys 0m1.177s 00:10:46.851 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:46.851 10:37:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:46.851 ************************************ 00:10:46.851 END TEST raid_function_test_concat 00:10:46.851 ************************************ 00:10:46.851 10:37:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:46.851 10:37:21 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:46.851 10:37:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:46.851 10:37:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:46.851 10:37:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
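Both function tests above (raid0 and concat) validate the data path the same way: 2 MiB of random data is staged in a reference file, written through /dev/nbd0 and compared back, and then three (offset, length) ranges are unmapped with blkdiscard while the same ranges are zeroed in the reference file, with cmp confirming after every step that device and file still agree. A condensed sketch of that loop, using the block offsets and counts that appear in the log and assuming /dev/nbd0 is still attached to the raid bdev:

    nbd=/dev/nbd0
    ref=/raidtest/raidrandtest
    blksize=512
    # seed the device and the reference file with identical random data
    dd if=/dev/urandom of="$ref" bs=$blksize count=4096
    dd if="$ref" of="$nbd" bs=$blksize count=4096 oflag=direct
    blockdev --flushbufs "$nbd"
    cmp -b -n 2097152 "$ref" "$nbd"
    # unmap three ranges on the device and mirror the zeroing in the file
    offs=(0 1028 321); nums=(128 2035 456)
    for i in 0 1 2; do
        dd if=/dev/zero of="$ref" bs=$blksize seek=${offs[$i]} count=${nums[$i]} conv=notrunc
        blkdiscard -o $(( offs[i] * blksize )) -l $(( nums[i] * blksize )) "$nbd"
        blockdev --flushbufs "$nbd"
        cmp -b -n 2097152 "$ref" "$nbd"
    done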
00:10:46.851 ************************************ 00:10:46.851 START TEST raid0_resize_test 00:10:46.851 ************************************ 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2014184 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2014184' 00:10:46.851 Process raid pid: 2014184 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2014184 /var/tmp/spdk-raid.sock 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 2014184 ']' 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:46.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:46.851 10:37:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.851 [2024-07-12 10:37:21.884081] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:10:46.851 [2024-07-12 10:37:21.884141] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:46.851 [2024-07-12 10:37:22.004489] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:47.110 [2024-07-12 10:37:22.114654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:47.110 [2024-07-12 10:37:22.176212] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.110 [2024-07-12 10:37:22.176240] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.368 10:37:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:47.368 10:37:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:47.368 10:37:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:47.653 Base_1 00:10:47.653 10:37:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:47.919 Base_2 00:10:47.919 10:37:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:48.177 [2024-07-12 10:37:23.327557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:48.177 [2024-07-12 10:37:23.328952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:48.177 [2024-07-12 10:37:23.328998] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2089780 00:10:48.177 [2024-07-12 10:37:23.329008] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:48.177 [2024-07-12 10:37:23.329213] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bd5020 00:10:48.177 [2024-07-12 10:37:23.329305] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2089780 00:10:48.177 [2024-07-12 10:37:23.329314] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2089780 00:10:48.177 [2024-07-12 10:37:23.329421] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:48.177 10:37:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:48.435 [2024-07-12 10:37:23.564169] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:48.435 [2024-07-12 10:37:23.564192] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:48.435 true 00:10:48.435 10:37:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:48.435 10:37:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:48.694 [2024-07-12 10:37:23.804957] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:48.694 10:37:23 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:48.694 10:37:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:48.694 10:37:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:48.694 10:37:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:48.953 [2024-07-12 10:37:24.037405] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:48.953 [2024-07-12 10:37:24.037431] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:48.953 [2024-07-12 10:37:24.037462] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:48.953 true 00:10:48.953 10:37:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:48.953 10:37:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:49.212 [2024-07-12 10:37:24.278193] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2014184 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 2014184 ']' 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 2014184 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2014184 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2014184' 00:10:49.212 killing process with pid 2014184 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 2014184 00:10:49.212 [2024-07-12 10:37:24.345985] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:49.212 [2024-07-12 10:37:24.346035] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:49.212 [2024-07-12 10:37:24.346074] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:49.212 [2024-07-12 10:37:24.346085] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2089780 name Raid, state offline 00:10:49.212 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 2014184 00:10:49.212 [2024-07-12 10:37:24.347319] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:49.473 10:37:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 
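That completes the raid0_resize_test body: two 32 MiB null bdevs with 512-byte blocks are combined into a raid0 volume with a 64 KiB strip, each base bdev is then resized to 64 MiB, and the raid's num_blocks doubles from 131072 to 262144. The same sequence as standalone RPC calls, sketched from the xtrace above (RPC is only a local shorthand for the rpc.py invocation used in the log):

RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_null_create Base_1 32 512                            # 32 MiB / 512 B = 65536 blocks
$RPC bdev_null_create Base_2 32 512
$RPC bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid    # raid0, 64 KiB strip -> 131072 blocks total
$RPC bdev_null_resize Base_1 64                                # grow the first base bdev to 64 MiB
$RPC bdev_null_resize Base_2 64                                # in this log the raid grows only once both base bdevs have grown
$RPC bdev_get_bdevs -b Raid | jq '.[].num_blocks'              # expect 262144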
00:10:49.473 00:10:49.473 real 0m2.700s 00:10:49.473 user 0m4.567s 00:10:49.473 sys 0m0.639s 00:10:49.473 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:49.473 10:37:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.473 ************************************ 00:10:49.473 END TEST raid0_resize_test 00:10:49.473 ************************************ 00:10:49.473 10:37:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:49.473 10:37:24 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:49.473 10:37:24 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:49.473 10:37:24 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:49.473 10:37:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:49.473 10:37:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:49.473 10:37:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:49.473 ************************************ 00:10:49.473 START TEST raid_state_function_test 00:10:49.473 ************************************ 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:49.473 10:37:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2014569 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2014569' 00:10:49.473 Process raid pid: 2014569 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2014569 /var/tmp/spdk-raid.sock 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2014569 ']' 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:49.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:49.473 10:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.733 [2024-07-12 10:37:24.678427] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
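With the second bdev_svc instance up, raid_state_function_test drives the raid bdev through its states, starting from "configuring" while the named base bdevs do not exist yet. The gist of the check performed next, sketched with the same commands that appear in the xtrace below (RPC is the same local shorthand as above; the real helper is verify_raid_bdev_state in bdev_raid.sh):

RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid    # base bdevs absent at this point
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
# expected output: configuring  (it flips to online only after both base bdevs are created and claimed)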
00:10:49.733 [2024-07-12 10:37:24.678499] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:49.733 [2024-07-12 10:37:24.798517] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:49.733 [2024-07-12 10:37:24.904353] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:49.993 [2024-07-12 10:37:24.972169] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:49.993 [2024-07-12 10:37:24.972206] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:50.562 10:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:50.562 10:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:50.562 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:50.821 [2024-07-12 10:37:25.819745] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:50.821 [2024-07-12 10:37:25.819786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:50.821 [2024-07-12 10:37:25.819797] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:50.821 [2024-07-12 10:37:25.819809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.822 10:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:51.081 10:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:51.081 "name": "Existed_Raid", 00:10:51.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:51.081 "strip_size_kb": 64, 00:10:51.081 "state": "configuring", 00:10:51.081 "raid_level": "raid0", 00:10:51.081 "superblock": false, 00:10:51.081 
"num_base_bdevs": 2, 00:10:51.081 "num_base_bdevs_discovered": 0, 00:10:51.081 "num_base_bdevs_operational": 2, 00:10:51.081 "base_bdevs_list": [ 00:10:51.081 { 00:10:51.081 "name": "BaseBdev1", 00:10:51.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:51.081 "is_configured": false, 00:10:51.081 "data_offset": 0, 00:10:51.081 "data_size": 0 00:10:51.081 }, 00:10:51.081 { 00:10:51.081 "name": "BaseBdev2", 00:10:51.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:51.081 "is_configured": false, 00:10:51.081 "data_offset": 0, 00:10:51.081 "data_size": 0 00:10:51.081 } 00:10:51.081 ] 00:10:51.081 }' 00:10:51.081 10:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:51.081 10:37:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.019 10:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:52.019 [2024-07-12 10:37:27.163162] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:52.019 [2024-07-12 10:37:27.163192] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd9a80 name Existed_Raid, state configuring 00:10:52.019 10:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:52.278 [2024-07-12 10:37:27.407815] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:52.278 [2024-07-12 10:37:27.407845] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:52.278 [2024-07-12 10:37:27.407855] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:52.278 [2024-07-12 10:37:27.407867] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:52.278 10:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:52.538 [2024-07-12 10:37:27.658580] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:52.538 BaseBdev1 00:10:52.538 10:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:52.538 10:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:52.538 10:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:52.538 10:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:52.538 10:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:52.538 10:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:52.538 10:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:52.797 10:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:53.057 [ 
00:10:53.057 { 00:10:53.057 "name": "BaseBdev1", 00:10:53.057 "aliases": [ 00:10:53.057 "0da1fadb-1583-4e43-aabb-fb06ff260e6a" 00:10:53.057 ], 00:10:53.057 "product_name": "Malloc disk", 00:10:53.057 "block_size": 512, 00:10:53.057 "num_blocks": 65536, 00:10:53.057 "uuid": "0da1fadb-1583-4e43-aabb-fb06ff260e6a", 00:10:53.057 "assigned_rate_limits": { 00:10:53.057 "rw_ios_per_sec": 0, 00:10:53.057 "rw_mbytes_per_sec": 0, 00:10:53.057 "r_mbytes_per_sec": 0, 00:10:53.057 "w_mbytes_per_sec": 0 00:10:53.057 }, 00:10:53.057 "claimed": true, 00:10:53.057 "claim_type": "exclusive_write", 00:10:53.057 "zoned": false, 00:10:53.057 "supported_io_types": { 00:10:53.057 "read": true, 00:10:53.057 "write": true, 00:10:53.057 "unmap": true, 00:10:53.057 "flush": true, 00:10:53.057 "reset": true, 00:10:53.057 "nvme_admin": false, 00:10:53.057 "nvme_io": false, 00:10:53.057 "nvme_io_md": false, 00:10:53.057 "write_zeroes": true, 00:10:53.057 "zcopy": true, 00:10:53.057 "get_zone_info": false, 00:10:53.057 "zone_management": false, 00:10:53.057 "zone_append": false, 00:10:53.057 "compare": false, 00:10:53.057 "compare_and_write": false, 00:10:53.057 "abort": true, 00:10:53.057 "seek_hole": false, 00:10:53.057 "seek_data": false, 00:10:53.057 "copy": true, 00:10:53.057 "nvme_iov_md": false 00:10:53.057 }, 00:10:53.057 "memory_domains": [ 00:10:53.057 { 00:10:53.057 "dma_device_id": "system", 00:10:53.057 "dma_device_type": 1 00:10:53.057 }, 00:10:53.057 { 00:10:53.057 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.057 "dma_device_type": 2 00:10:53.057 } 00:10:53.057 ], 00:10:53.057 "driver_specific": {} 00:10:53.057 } 00:10:53.057 ] 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.057 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:53.316 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:53.316 "name": "Existed_Raid", 00:10:53.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:53.316 "strip_size_kb": 64, 00:10:53.316 "state": "configuring", 00:10:53.316 "raid_level": "raid0", 
00:10:53.316 "superblock": false, 00:10:53.316 "num_base_bdevs": 2, 00:10:53.316 "num_base_bdevs_discovered": 1, 00:10:53.316 "num_base_bdevs_operational": 2, 00:10:53.316 "base_bdevs_list": [ 00:10:53.316 { 00:10:53.316 "name": "BaseBdev1", 00:10:53.316 "uuid": "0da1fadb-1583-4e43-aabb-fb06ff260e6a", 00:10:53.316 "is_configured": true, 00:10:53.316 "data_offset": 0, 00:10:53.316 "data_size": 65536 00:10:53.316 }, 00:10:53.316 { 00:10:53.316 "name": "BaseBdev2", 00:10:53.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:53.316 "is_configured": false, 00:10:53.316 "data_offset": 0, 00:10:53.316 "data_size": 0 00:10:53.316 } 00:10:53.316 ] 00:10:53.316 }' 00:10:53.316 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.316 10:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:53.884 10:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:54.144 [2024-07-12 10:37:29.222729] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:54.144 [2024-07-12 10:37:29.222770] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd9350 name Existed_Raid, state configuring 00:10:54.144 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:54.403 [2024-07-12 10:37:29.463391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:54.403 [2024-07-12 10:37:29.464916] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:54.403 [2024-07-12 10:37:29.464949] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:10:54.403 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:54.662 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.662 "name": "Existed_Raid", 00:10:54.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:54.662 "strip_size_kb": 64, 00:10:54.662 "state": "configuring", 00:10:54.662 "raid_level": "raid0", 00:10:54.662 "superblock": false, 00:10:54.662 "num_base_bdevs": 2, 00:10:54.662 "num_base_bdevs_discovered": 1, 00:10:54.662 "num_base_bdevs_operational": 2, 00:10:54.662 "base_bdevs_list": [ 00:10:54.662 { 00:10:54.662 "name": "BaseBdev1", 00:10:54.662 "uuid": "0da1fadb-1583-4e43-aabb-fb06ff260e6a", 00:10:54.662 "is_configured": true, 00:10:54.662 "data_offset": 0, 00:10:54.662 "data_size": 65536 00:10:54.662 }, 00:10:54.662 { 00:10:54.662 "name": "BaseBdev2", 00:10:54.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:54.662 "is_configured": false, 00:10:54.662 "data_offset": 0, 00:10:54.662 "data_size": 0 00:10:54.662 } 00:10:54.662 ] 00:10:54.662 }' 00:10:54.662 10:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.662 10:37:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:55.231 10:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:55.490 [2024-07-12 10:37:30.553727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:55.490 [2024-07-12 10:37:30.553763] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcda000 00:10:55.490 [2024-07-12 10:37:30.553772] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:55.490 [2024-07-12 10:37:30.553957] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbf40c0 00:10:55.490 [2024-07-12 10:37:30.554075] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcda000 00:10:55.490 [2024-07-12 10:37:30.554085] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcda000 00:10:55.490 [2024-07-12 10:37:30.554242] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:55.490 BaseBdev2 00:10:55.490 10:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:55.490 10:37:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:55.490 10:37:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:55.490 10:37:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:55.490 10:37:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:55.490 10:37:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:55.490 10:37:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:55.750 10:37:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:56.009 [ 
00:10:56.009 { 00:10:56.009 "name": "BaseBdev2", 00:10:56.009 "aliases": [ 00:10:56.009 "cae80ca4-b2b0-4e28-8605-3dcab9a67613" 00:10:56.009 ], 00:10:56.009 "product_name": "Malloc disk", 00:10:56.009 "block_size": 512, 00:10:56.009 "num_blocks": 65536, 00:10:56.009 "uuid": "cae80ca4-b2b0-4e28-8605-3dcab9a67613", 00:10:56.009 "assigned_rate_limits": { 00:10:56.009 "rw_ios_per_sec": 0, 00:10:56.009 "rw_mbytes_per_sec": 0, 00:10:56.009 "r_mbytes_per_sec": 0, 00:10:56.009 "w_mbytes_per_sec": 0 00:10:56.009 }, 00:10:56.009 "claimed": true, 00:10:56.009 "claim_type": "exclusive_write", 00:10:56.009 "zoned": false, 00:10:56.009 "supported_io_types": { 00:10:56.009 "read": true, 00:10:56.009 "write": true, 00:10:56.009 "unmap": true, 00:10:56.009 "flush": true, 00:10:56.009 "reset": true, 00:10:56.009 "nvme_admin": false, 00:10:56.009 "nvme_io": false, 00:10:56.009 "nvme_io_md": false, 00:10:56.009 "write_zeroes": true, 00:10:56.009 "zcopy": true, 00:10:56.009 "get_zone_info": false, 00:10:56.009 "zone_management": false, 00:10:56.009 "zone_append": false, 00:10:56.009 "compare": false, 00:10:56.009 "compare_and_write": false, 00:10:56.009 "abort": true, 00:10:56.009 "seek_hole": false, 00:10:56.009 "seek_data": false, 00:10:56.009 "copy": true, 00:10:56.009 "nvme_iov_md": false 00:10:56.009 }, 00:10:56.009 "memory_domains": [ 00:10:56.009 { 00:10:56.009 "dma_device_id": "system", 00:10:56.009 "dma_device_type": 1 00:10:56.009 }, 00:10:56.009 { 00:10:56.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.009 "dma_device_type": 2 00:10:56.009 } 00:10:56.009 ], 00:10:56.009 "driver_specific": {} 00:10:56.009 } 00:10:56.009 ] 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.009 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:56.269 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:56.269 "name": "Existed_Raid", 00:10:56.269 "uuid": "223546d3-facf-476a-8f0c-2b14f8bbc063", 00:10:56.269 "strip_size_kb": 64, 00:10:56.269 "state": "online", 00:10:56.269 "raid_level": "raid0", 00:10:56.269 "superblock": false, 00:10:56.269 "num_base_bdevs": 2, 00:10:56.269 "num_base_bdevs_discovered": 2, 00:10:56.269 "num_base_bdevs_operational": 2, 00:10:56.269 "base_bdevs_list": [ 00:10:56.269 { 00:10:56.269 "name": "BaseBdev1", 00:10:56.269 "uuid": "0da1fadb-1583-4e43-aabb-fb06ff260e6a", 00:10:56.269 "is_configured": true, 00:10:56.269 "data_offset": 0, 00:10:56.269 "data_size": 65536 00:10:56.269 }, 00:10:56.269 { 00:10:56.269 "name": "BaseBdev2", 00:10:56.269 "uuid": "cae80ca4-b2b0-4e28-8605-3dcab9a67613", 00:10:56.269 "is_configured": true, 00:10:56.269 "data_offset": 0, 00:10:56.269 "data_size": 65536 00:10:56.269 } 00:10:56.269 ] 00:10:56.269 }' 00:10:56.269 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:56.269 10:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:56.836 10:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:57.096 [2024-07-12 10:37:32.126191] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:57.096 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:57.096 "name": "Existed_Raid", 00:10:57.096 "aliases": [ 00:10:57.096 "223546d3-facf-476a-8f0c-2b14f8bbc063" 00:10:57.096 ], 00:10:57.096 "product_name": "Raid Volume", 00:10:57.096 "block_size": 512, 00:10:57.096 "num_blocks": 131072, 00:10:57.096 "uuid": "223546d3-facf-476a-8f0c-2b14f8bbc063", 00:10:57.096 "assigned_rate_limits": { 00:10:57.096 "rw_ios_per_sec": 0, 00:10:57.096 "rw_mbytes_per_sec": 0, 00:10:57.096 "r_mbytes_per_sec": 0, 00:10:57.096 "w_mbytes_per_sec": 0 00:10:57.096 }, 00:10:57.096 "claimed": false, 00:10:57.096 "zoned": false, 00:10:57.096 "supported_io_types": { 00:10:57.096 "read": true, 00:10:57.096 "write": true, 00:10:57.096 "unmap": true, 00:10:57.096 "flush": true, 00:10:57.096 "reset": true, 00:10:57.096 "nvme_admin": false, 00:10:57.096 "nvme_io": false, 00:10:57.096 "nvme_io_md": false, 00:10:57.096 "write_zeroes": true, 00:10:57.096 "zcopy": false, 00:10:57.096 "get_zone_info": false, 00:10:57.096 "zone_management": false, 00:10:57.096 "zone_append": false, 00:10:57.096 "compare": false, 00:10:57.096 "compare_and_write": false, 00:10:57.096 "abort": false, 00:10:57.096 "seek_hole": false, 00:10:57.096 "seek_data": false, 00:10:57.096 "copy": false, 00:10:57.096 "nvme_iov_md": false 00:10:57.096 }, 00:10:57.096 
"memory_domains": [ 00:10:57.096 { 00:10:57.096 "dma_device_id": "system", 00:10:57.096 "dma_device_type": 1 00:10:57.096 }, 00:10:57.096 { 00:10:57.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.096 "dma_device_type": 2 00:10:57.096 }, 00:10:57.096 { 00:10:57.096 "dma_device_id": "system", 00:10:57.096 "dma_device_type": 1 00:10:57.096 }, 00:10:57.096 { 00:10:57.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.096 "dma_device_type": 2 00:10:57.096 } 00:10:57.096 ], 00:10:57.096 "driver_specific": { 00:10:57.096 "raid": { 00:10:57.096 "uuid": "223546d3-facf-476a-8f0c-2b14f8bbc063", 00:10:57.096 "strip_size_kb": 64, 00:10:57.096 "state": "online", 00:10:57.096 "raid_level": "raid0", 00:10:57.096 "superblock": false, 00:10:57.096 "num_base_bdevs": 2, 00:10:57.096 "num_base_bdevs_discovered": 2, 00:10:57.096 "num_base_bdevs_operational": 2, 00:10:57.096 "base_bdevs_list": [ 00:10:57.096 { 00:10:57.096 "name": "BaseBdev1", 00:10:57.096 "uuid": "0da1fadb-1583-4e43-aabb-fb06ff260e6a", 00:10:57.096 "is_configured": true, 00:10:57.096 "data_offset": 0, 00:10:57.096 "data_size": 65536 00:10:57.096 }, 00:10:57.096 { 00:10:57.096 "name": "BaseBdev2", 00:10:57.096 "uuid": "cae80ca4-b2b0-4e28-8605-3dcab9a67613", 00:10:57.096 "is_configured": true, 00:10:57.096 "data_offset": 0, 00:10:57.096 "data_size": 65536 00:10:57.096 } 00:10:57.096 ] 00:10:57.096 } 00:10:57.096 } 00:10:57.096 }' 00:10:57.096 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:57.096 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:57.096 BaseBdev2' 00:10:57.096 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:57.096 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:57.096 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:57.355 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:57.355 "name": "BaseBdev1", 00:10:57.355 "aliases": [ 00:10:57.355 "0da1fadb-1583-4e43-aabb-fb06ff260e6a" 00:10:57.355 ], 00:10:57.355 "product_name": "Malloc disk", 00:10:57.355 "block_size": 512, 00:10:57.355 "num_blocks": 65536, 00:10:57.355 "uuid": "0da1fadb-1583-4e43-aabb-fb06ff260e6a", 00:10:57.355 "assigned_rate_limits": { 00:10:57.355 "rw_ios_per_sec": 0, 00:10:57.355 "rw_mbytes_per_sec": 0, 00:10:57.355 "r_mbytes_per_sec": 0, 00:10:57.355 "w_mbytes_per_sec": 0 00:10:57.355 }, 00:10:57.355 "claimed": true, 00:10:57.355 "claim_type": "exclusive_write", 00:10:57.355 "zoned": false, 00:10:57.355 "supported_io_types": { 00:10:57.355 "read": true, 00:10:57.355 "write": true, 00:10:57.355 "unmap": true, 00:10:57.355 "flush": true, 00:10:57.355 "reset": true, 00:10:57.355 "nvme_admin": false, 00:10:57.355 "nvme_io": false, 00:10:57.355 "nvme_io_md": false, 00:10:57.355 "write_zeroes": true, 00:10:57.355 "zcopy": true, 00:10:57.355 "get_zone_info": false, 00:10:57.355 "zone_management": false, 00:10:57.355 "zone_append": false, 00:10:57.355 "compare": false, 00:10:57.355 "compare_and_write": false, 00:10:57.355 "abort": true, 00:10:57.355 "seek_hole": false, 00:10:57.355 "seek_data": false, 00:10:57.355 "copy": true, 00:10:57.355 "nvme_iov_md": false 00:10:57.355 }, 00:10:57.355 
"memory_domains": [ 00:10:57.355 { 00:10:57.355 "dma_device_id": "system", 00:10:57.356 "dma_device_type": 1 00:10:57.356 }, 00:10:57.356 { 00:10:57.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.356 "dma_device_type": 2 00:10:57.356 } 00:10:57.356 ], 00:10:57.356 "driver_specific": {} 00:10:57.356 }' 00:10:57.356 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:57.356 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:57.356 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:57.356 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:57.614 10:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:57.873 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:57.873 "name": "BaseBdev2", 00:10:57.873 "aliases": [ 00:10:57.873 "cae80ca4-b2b0-4e28-8605-3dcab9a67613" 00:10:57.873 ], 00:10:57.873 "product_name": "Malloc disk", 00:10:57.873 "block_size": 512, 00:10:57.873 "num_blocks": 65536, 00:10:57.873 "uuid": "cae80ca4-b2b0-4e28-8605-3dcab9a67613", 00:10:57.873 "assigned_rate_limits": { 00:10:57.873 "rw_ios_per_sec": 0, 00:10:57.873 "rw_mbytes_per_sec": 0, 00:10:57.873 "r_mbytes_per_sec": 0, 00:10:57.873 "w_mbytes_per_sec": 0 00:10:57.873 }, 00:10:57.873 "claimed": true, 00:10:57.874 "claim_type": "exclusive_write", 00:10:57.874 "zoned": false, 00:10:57.874 "supported_io_types": { 00:10:57.874 "read": true, 00:10:57.874 "write": true, 00:10:57.874 "unmap": true, 00:10:57.874 "flush": true, 00:10:57.874 "reset": true, 00:10:57.874 "nvme_admin": false, 00:10:57.874 "nvme_io": false, 00:10:57.874 "nvme_io_md": false, 00:10:57.874 "write_zeroes": true, 00:10:57.874 "zcopy": true, 00:10:57.874 "get_zone_info": false, 00:10:57.874 "zone_management": false, 00:10:57.874 "zone_append": false, 00:10:57.874 "compare": false, 00:10:57.874 "compare_and_write": false, 00:10:57.874 "abort": true, 00:10:57.874 "seek_hole": false, 00:10:57.874 "seek_data": false, 00:10:57.874 "copy": true, 00:10:57.874 "nvme_iov_md": false 00:10:57.874 }, 00:10:57.874 "memory_domains": [ 00:10:57.874 { 00:10:57.874 "dma_device_id": "system", 00:10:57.874 "dma_device_type": 1 00:10:57.874 }, 00:10:57.874 { 00:10:57.874 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:57.874 "dma_device_type": 2 00:10:57.874 } 00:10:57.874 ], 00:10:57.874 "driver_specific": {} 00:10:57.874 }' 00:10:57.874 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:57.874 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:58.134 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:58.392 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:58.392 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:58.392 [2024-07-12 10:37:33.569791] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:58.392 [2024-07-12 10:37:33.569819] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:58.392 [2024-07-12 10:37:33.569860] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.650 10:37:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.650 "name": "Existed_Raid", 00:10:58.650 "uuid": "223546d3-facf-476a-8f0c-2b14f8bbc063", 00:10:58.650 "strip_size_kb": 64, 00:10:58.650 "state": "offline", 00:10:58.650 "raid_level": "raid0", 00:10:58.650 "superblock": false, 00:10:58.650 "num_base_bdevs": 2, 00:10:58.650 "num_base_bdevs_discovered": 1, 00:10:58.650 "num_base_bdevs_operational": 1, 00:10:58.650 "base_bdevs_list": [ 00:10:58.650 { 00:10:58.650 "name": null, 00:10:58.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:58.650 "is_configured": false, 00:10:58.650 "data_offset": 0, 00:10:58.650 "data_size": 65536 00:10:58.650 }, 00:10:58.650 { 00:10:58.650 "name": "BaseBdev2", 00:10:58.650 "uuid": "cae80ca4-b2b0-4e28-8605-3dcab9a67613", 00:10:58.650 "is_configured": true, 00:10:58.650 "data_offset": 0, 00:10:58.650 "data_size": 65536 00:10:58.650 } 00:10:58.650 ] 00:10:58.650 }' 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.650 10:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.215 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:59.215 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:59.215 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.215 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:59.474 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:59.474 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:59.474 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:59.733 [2024-07-12 10:37:34.806696] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:59.733 [2024-07-12 10:37:34.806747] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcda000 name Existed_Raid, state offline 00:10:59.733 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:59.733 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:59.733 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.733 10:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:59.991 10:37:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2014569 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2014569 ']' 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2014569 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2014569 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2014569' 00:10:59.991 killing process with pid 2014569 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2014569 00:10:59.991 [2024-07-12 10:37:35.130251] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:59.991 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2014569 00:10:59.991 [2024-07-12 10:37:35.131105] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:00.249 00:11:00.249 real 0m10.723s 00:11:00.249 user 0m19.067s 00:11:00.249 sys 0m1.991s 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.249 ************************************ 00:11:00.249 END TEST raid_state_function_test 00:11:00.249 ************************************ 00:11:00.249 10:37:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:00.249 10:37:35 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:11:00.249 10:37:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:00.249 10:37:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:00.249 10:37:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:00.249 ************************************ 00:11:00.249 START TEST raid_state_function_test_sb 00:11:00.249 ************************************ 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:00.249 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2016206 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2016206' 00:11:00.250 Process raid pid: 2016206 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2016206 /var/tmp/spdk-raid.sock 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2016206 ']' 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:00.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:00.250 10:37:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:00.508 [2024-07-12 10:37:35.493408] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
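raid_state_function_test_sb repeats the same raid0 two-bdev state walk with superblock=true, so superblock_create_arg becomes -s and the bdev_raid_create calls carry it. A sketch of the create call as it appears in the xtrace below, assuming the same RPC shorthand as above:

RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid   # -s: create the raid with an on-disk superblock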
00:11:00.508 [2024-07-12 10:37:35.493493] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:00.508 [2024-07-12 10:37:35.627067] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:00.765 [2024-07-12 10:37:35.729872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:00.766 [2024-07-12 10:37:35.796250] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:00.766 [2024-07-12 10:37:35.796289] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:01.331 10:37:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:01.331 10:37:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:01.331 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:01.588 [2024-07-12 10:37:36.655270] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:01.588 [2024-07-12 10:37:36.655313] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:01.588 [2024-07-12 10:37:36.655324] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:01.588 [2024-07-12 10:37:36.655336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.588 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.589 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.589 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.589 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.589 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.856 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.856 "name": "Existed_Raid", 00:11:01.856 "uuid": "2f67ca85-4e4a-4da5-a943-d8c334962af4", 00:11:01.856 "strip_size_kb": 64, 00:11:01.856 "state": "configuring", 00:11:01.856 "raid_level": 
"raid0", 00:11:01.856 "superblock": true, 00:11:01.856 "num_base_bdevs": 2, 00:11:01.856 "num_base_bdevs_discovered": 0, 00:11:01.856 "num_base_bdevs_operational": 2, 00:11:01.856 "base_bdevs_list": [ 00:11:01.856 { 00:11:01.856 "name": "BaseBdev1", 00:11:01.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.856 "is_configured": false, 00:11:01.856 "data_offset": 0, 00:11:01.856 "data_size": 0 00:11:01.856 }, 00:11:01.856 { 00:11:01.856 "name": "BaseBdev2", 00:11:01.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.856 "is_configured": false, 00:11:01.856 "data_offset": 0, 00:11:01.856 "data_size": 0 00:11:01.856 } 00:11:01.856 ] 00:11:01.856 }' 00:11:01.856 10:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.856 10:37:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:02.439 10:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:02.697 [2024-07-12 10:37:37.754035] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:02.697 [2024-07-12 10:37:37.754069] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e4a80 name Existed_Raid, state configuring 00:11:02.697 10:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:02.954 [2024-07-12 10:37:38.002716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:02.954 [2024-07-12 10:37:38.002743] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:02.954 [2024-07-12 10:37:38.002753] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:02.954 [2024-07-12 10:37:38.002765] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:02.954 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:03.212 [2024-07-12 10:37:38.253303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:03.212 BaseBdev1 00:11:03.212 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:03.212 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:03.212 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:03.212 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:03.212 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:03.212 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:03.212 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:03.469 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:03.730 [ 00:11:03.730 { 00:11:03.730 "name": "BaseBdev1", 00:11:03.730 "aliases": [ 00:11:03.730 "29332174-5a25-4f7e-94fa-2b41d67aca64" 00:11:03.730 ], 00:11:03.730 "product_name": "Malloc disk", 00:11:03.730 "block_size": 512, 00:11:03.730 "num_blocks": 65536, 00:11:03.730 "uuid": "29332174-5a25-4f7e-94fa-2b41d67aca64", 00:11:03.730 "assigned_rate_limits": { 00:11:03.730 "rw_ios_per_sec": 0, 00:11:03.730 "rw_mbytes_per_sec": 0, 00:11:03.730 "r_mbytes_per_sec": 0, 00:11:03.730 "w_mbytes_per_sec": 0 00:11:03.730 }, 00:11:03.730 "claimed": true, 00:11:03.730 "claim_type": "exclusive_write", 00:11:03.730 "zoned": false, 00:11:03.730 "supported_io_types": { 00:11:03.730 "read": true, 00:11:03.730 "write": true, 00:11:03.730 "unmap": true, 00:11:03.730 "flush": true, 00:11:03.730 "reset": true, 00:11:03.730 "nvme_admin": false, 00:11:03.730 "nvme_io": false, 00:11:03.730 "nvme_io_md": false, 00:11:03.730 "write_zeroes": true, 00:11:03.730 "zcopy": true, 00:11:03.730 "get_zone_info": false, 00:11:03.730 "zone_management": false, 00:11:03.730 "zone_append": false, 00:11:03.730 "compare": false, 00:11:03.730 "compare_and_write": false, 00:11:03.730 "abort": true, 00:11:03.730 "seek_hole": false, 00:11:03.730 "seek_data": false, 00:11:03.730 "copy": true, 00:11:03.730 "nvme_iov_md": false 00:11:03.730 }, 00:11:03.730 "memory_domains": [ 00:11:03.730 { 00:11:03.730 "dma_device_id": "system", 00:11:03.730 "dma_device_type": 1 00:11:03.730 }, 00:11:03.730 { 00:11:03.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:03.730 "dma_device_type": 2 00:11:03.730 } 00:11:03.730 ], 00:11:03.730 "driver_specific": {} 00:11:03.730 } 00:11:03.730 ] 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.730 10:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.990 10:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.990 "name": 
"Existed_Raid", 00:11:03.990 "uuid": "873cae2e-532d-4f38-ae09-b4da1eb92cb9", 00:11:03.990 "strip_size_kb": 64, 00:11:03.990 "state": "configuring", 00:11:03.990 "raid_level": "raid0", 00:11:03.990 "superblock": true, 00:11:03.990 "num_base_bdevs": 2, 00:11:03.990 "num_base_bdevs_discovered": 1, 00:11:03.990 "num_base_bdevs_operational": 2, 00:11:03.990 "base_bdevs_list": [ 00:11:03.990 { 00:11:03.990 "name": "BaseBdev1", 00:11:03.990 "uuid": "29332174-5a25-4f7e-94fa-2b41d67aca64", 00:11:03.990 "is_configured": true, 00:11:03.990 "data_offset": 2048, 00:11:03.990 "data_size": 63488 00:11:03.990 }, 00:11:03.990 { 00:11:03.990 "name": "BaseBdev2", 00:11:03.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.990 "is_configured": false, 00:11:03.990 "data_offset": 0, 00:11:03.990 "data_size": 0 00:11:03.990 } 00:11:03.990 ] 00:11:03.990 }' 00:11:03.990 10:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.990 10:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:04.555 10:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:04.814 [2024-07-12 10:37:39.821677] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:04.814 [2024-07-12 10:37:39.821717] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e4350 name Existed_Raid, state configuring 00:11:04.814 10:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:05.072 [2024-07-12 10:37:40.070383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:05.072 [2024-07-12 10:37:40.071876] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:05.072 [2024-07-12 10:37:40.071909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.072 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:05.331 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:05.331 "name": "Existed_Raid", 00:11:05.331 "uuid": "394feffe-46c1-4fa1-b979-db05cecf5e7c", 00:11:05.331 "strip_size_kb": 64, 00:11:05.331 "state": "configuring", 00:11:05.331 "raid_level": "raid0", 00:11:05.331 "superblock": true, 00:11:05.331 "num_base_bdevs": 2, 00:11:05.331 "num_base_bdevs_discovered": 1, 00:11:05.331 "num_base_bdevs_operational": 2, 00:11:05.331 "base_bdevs_list": [ 00:11:05.331 { 00:11:05.331 "name": "BaseBdev1", 00:11:05.331 "uuid": "29332174-5a25-4f7e-94fa-2b41d67aca64", 00:11:05.331 "is_configured": true, 00:11:05.331 "data_offset": 2048, 00:11:05.331 "data_size": 63488 00:11:05.331 }, 00:11:05.331 { 00:11:05.331 "name": "BaseBdev2", 00:11:05.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:05.331 "is_configured": false, 00:11:05.331 "data_offset": 0, 00:11:05.331 "data_size": 0 00:11:05.331 } 00:11:05.331 ] 00:11:05.331 }' 00:11:05.331 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:05.331 10:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:05.897 10:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:05.897 [2024-07-12 10:37:41.044340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:05.897 [2024-07-12 10:37:41.044503] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e5000 00:11:05.898 [2024-07-12 10:37:41.044518] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:05.898 [2024-07-12 10:37:41.044691] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ff0c0 00:11:05.898 [2024-07-12 10:37:41.044806] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e5000 00:11:05.898 [2024-07-12 10:37:41.044816] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24e5000 00:11:05.898 [2024-07-12 10:37:41.044907] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:05.898 BaseBdev2 00:11:05.898 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:05.898 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:05.898 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:05.898 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:05.898 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:05.898 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:05.898 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:06.156 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:06.415 [ 00:11:06.415 { 00:11:06.415 "name": "BaseBdev2", 00:11:06.415 "aliases": [ 00:11:06.415 "9b2416b3-2853-4792-ac2b-67a8f4522acb" 00:11:06.415 ], 00:11:06.415 "product_name": "Malloc disk", 00:11:06.415 "block_size": 512, 00:11:06.415 "num_blocks": 65536, 00:11:06.415 "uuid": "9b2416b3-2853-4792-ac2b-67a8f4522acb", 00:11:06.415 "assigned_rate_limits": { 00:11:06.415 "rw_ios_per_sec": 0, 00:11:06.415 "rw_mbytes_per_sec": 0, 00:11:06.415 "r_mbytes_per_sec": 0, 00:11:06.415 "w_mbytes_per_sec": 0 00:11:06.415 }, 00:11:06.415 "claimed": true, 00:11:06.415 "claim_type": "exclusive_write", 00:11:06.415 "zoned": false, 00:11:06.415 "supported_io_types": { 00:11:06.415 "read": true, 00:11:06.415 "write": true, 00:11:06.415 "unmap": true, 00:11:06.415 "flush": true, 00:11:06.415 "reset": true, 00:11:06.415 "nvme_admin": false, 00:11:06.415 "nvme_io": false, 00:11:06.415 "nvme_io_md": false, 00:11:06.415 "write_zeroes": true, 00:11:06.415 "zcopy": true, 00:11:06.415 "get_zone_info": false, 00:11:06.415 "zone_management": false, 00:11:06.415 "zone_append": false, 00:11:06.415 "compare": false, 00:11:06.415 "compare_and_write": false, 00:11:06.415 "abort": true, 00:11:06.415 "seek_hole": false, 00:11:06.415 "seek_data": false, 00:11:06.415 "copy": true, 00:11:06.415 "nvme_iov_md": false 00:11:06.415 }, 00:11:06.415 "memory_domains": [ 00:11:06.415 { 00:11:06.415 "dma_device_id": "system", 00:11:06.415 "dma_device_type": 1 00:11:06.415 }, 00:11:06.415 { 00:11:06.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.415 "dma_device_type": 2 00:11:06.415 } 00:11:06.415 ], 00:11:06.415 "driver_specific": {} 00:11:06.415 } 00:11:06.415 ] 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:06.415 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:06.674 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.674 "name": "Existed_Raid", 00:11:06.674 "uuid": "394feffe-46c1-4fa1-b979-db05cecf5e7c", 00:11:06.674 "strip_size_kb": 64, 00:11:06.674 "state": "online", 00:11:06.674 "raid_level": "raid0", 00:11:06.674 "superblock": true, 00:11:06.674 "num_base_bdevs": 2, 00:11:06.674 "num_base_bdevs_discovered": 2, 00:11:06.674 "num_base_bdevs_operational": 2, 00:11:06.674 "base_bdevs_list": [ 00:11:06.674 { 00:11:06.674 "name": "BaseBdev1", 00:11:06.674 "uuid": "29332174-5a25-4f7e-94fa-2b41d67aca64", 00:11:06.674 "is_configured": true, 00:11:06.674 "data_offset": 2048, 00:11:06.674 "data_size": 63488 00:11:06.674 }, 00:11:06.674 { 00:11:06.674 "name": "BaseBdev2", 00:11:06.674 "uuid": "9b2416b3-2853-4792-ac2b-67a8f4522acb", 00:11:06.674 "is_configured": true, 00:11:06.674 "data_offset": 2048, 00:11:06.674 "data_size": 63488 00:11:06.674 } 00:11:06.674 ] 00:11:06.674 }' 00:11:06.674 10:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.674 10:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:07.242 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:07.500 [2024-07-12 10:37:42.628829] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:07.500 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:07.500 "name": "Existed_Raid", 00:11:07.500 "aliases": [ 00:11:07.500 "394feffe-46c1-4fa1-b979-db05cecf5e7c" 00:11:07.500 ], 00:11:07.500 "product_name": "Raid Volume", 00:11:07.500 "block_size": 512, 00:11:07.500 "num_blocks": 126976, 00:11:07.500 "uuid": "394feffe-46c1-4fa1-b979-db05cecf5e7c", 00:11:07.500 "assigned_rate_limits": { 00:11:07.500 "rw_ios_per_sec": 0, 00:11:07.500 "rw_mbytes_per_sec": 0, 00:11:07.500 "r_mbytes_per_sec": 0, 00:11:07.500 "w_mbytes_per_sec": 0 00:11:07.500 }, 00:11:07.500 "claimed": false, 00:11:07.500 "zoned": false, 00:11:07.500 "supported_io_types": { 00:11:07.500 "read": true, 00:11:07.500 "write": true, 00:11:07.501 "unmap": true, 00:11:07.501 "flush": true, 00:11:07.501 "reset": true, 00:11:07.501 "nvme_admin": false, 00:11:07.501 "nvme_io": false, 00:11:07.501 "nvme_io_md": false, 00:11:07.501 "write_zeroes": true, 
00:11:07.501 "zcopy": false, 00:11:07.501 "get_zone_info": false, 00:11:07.501 "zone_management": false, 00:11:07.501 "zone_append": false, 00:11:07.501 "compare": false, 00:11:07.501 "compare_and_write": false, 00:11:07.501 "abort": false, 00:11:07.501 "seek_hole": false, 00:11:07.501 "seek_data": false, 00:11:07.501 "copy": false, 00:11:07.501 "nvme_iov_md": false 00:11:07.501 }, 00:11:07.501 "memory_domains": [ 00:11:07.501 { 00:11:07.501 "dma_device_id": "system", 00:11:07.501 "dma_device_type": 1 00:11:07.501 }, 00:11:07.501 { 00:11:07.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.501 "dma_device_type": 2 00:11:07.501 }, 00:11:07.501 { 00:11:07.501 "dma_device_id": "system", 00:11:07.501 "dma_device_type": 1 00:11:07.501 }, 00:11:07.501 { 00:11:07.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.501 "dma_device_type": 2 00:11:07.501 } 00:11:07.501 ], 00:11:07.501 "driver_specific": { 00:11:07.501 "raid": { 00:11:07.501 "uuid": "394feffe-46c1-4fa1-b979-db05cecf5e7c", 00:11:07.501 "strip_size_kb": 64, 00:11:07.501 "state": "online", 00:11:07.501 "raid_level": "raid0", 00:11:07.501 "superblock": true, 00:11:07.501 "num_base_bdevs": 2, 00:11:07.501 "num_base_bdevs_discovered": 2, 00:11:07.501 "num_base_bdevs_operational": 2, 00:11:07.501 "base_bdevs_list": [ 00:11:07.501 { 00:11:07.501 "name": "BaseBdev1", 00:11:07.501 "uuid": "29332174-5a25-4f7e-94fa-2b41d67aca64", 00:11:07.501 "is_configured": true, 00:11:07.501 "data_offset": 2048, 00:11:07.501 "data_size": 63488 00:11:07.501 }, 00:11:07.501 { 00:11:07.501 "name": "BaseBdev2", 00:11:07.501 "uuid": "9b2416b3-2853-4792-ac2b-67a8f4522acb", 00:11:07.501 "is_configured": true, 00:11:07.501 "data_offset": 2048, 00:11:07.501 "data_size": 63488 00:11:07.501 } 00:11:07.501 ] 00:11:07.501 } 00:11:07.501 } 00:11:07.501 }' 00:11:07.501 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:07.760 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:07.760 BaseBdev2' 00:11:07.760 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:07.760 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:07.760 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:07.760 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:07.760 "name": "BaseBdev1", 00:11:07.760 "aliases": [ 00:11:07.760 "29332174-5a25-4f7e-94fa-2b41d67aca64" 00:11:07.760 ], 00:11:07.760 "product_name": "Malloc disk", 00:11:07.760 "block_size": 512, 00:11:07.760 "num_blocks": 65536, 00:11:07.760 "uuid": "29332174-5a25-4f7e-94fa-2b41d67aca64", 00:11:07.760 "assigned_rate_limits": { 00:11:07.760 "rw_ios_per_sec": 0, 00:11:07.760 "rw_mbytes_per_sec": 0, 00:11:07.760 "r_mbytes_per_sec": 0, 00:11:07.760 "w_mbytes_per_sec": 0 00:11:07.760 }, 00:11:07.760 "claimed": true, 00:11:07.760 "claim_type": "exclusive_write", 00:11:07.760 "zoned": false, 00:11:07.760 "supported_io_types": { 00:11:07.760 "read": true, 00:11:07.760 "write": true, 00:11:07.760 "unmap": true, 00:11:07.760 "flush": true, 00:11:07.760 "reset": true, 00:11:07.760 "nvme_admin": false, 00:11:07.760 "nvme_io": false, 00:11:07.760 "nvme_io_md": false, 00:11:07.760 
"write_zeroes": true, 00:11:07.760 "zcopy": true, 00:11:07.760 "get_zone_info": false, 00:11:07.760 "zone_management": false, 00:11:07.760 "zone_append": false, 00:11:07.760 "compare": false, 00:11:07.760 "compare_and_write": false, 00:11:07.760 "abort": true, 00:11:07.760 "seek_hole": false, 00:11:07.760 "seek_data": false, 00:11:07.760 "copy": true, 00:11:07.760 "nvme_iov_md": false 00:11:07.760 }, 00:11:07.760 "memory_domains": [ 00:11:07.760 { 00:11:07.760 "dma_device_id": "system", 00:11:07.760 "dma_device_type": 1 00:11:07.760 }, 00:11:07.760 { 00:11:07.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.760 "dma_device_type": 2 00:11:07.760 } 00:11:07.760 ], 00:11:07.760 "driver_specific": {} 00:11:07.760 }' 00:11:07.760 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.017 10:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:08.017 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.274 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.274 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:08.275 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:08.275 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:08.275 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:08.533 "name": "BaseBdev2", 00:11:08.533 "aliases": [ 00:11:08.533 "9b2416b3-2853-4792-ac2b-67a8f4522acb" 00:11:08.533 ], 00:11:08.533 "product_name": "Malloc disk", 00:11:08.533 "block_size": 512, 00:11:08.533 "num_blocks": 65536, 00:11:08.533 "uuid": "9b2416b3-2853-4792-ac2b-67a8f4522acb", 00:11:08.533 "assigned_rate_limits": { 00:11:08.533 "rw_ios_per_sec": 0, 00:11:08.533 "rw_mbytes_per_sec": 0, 00:11:08.533 "r_mbytes_per_sec": 0, 00:11:08.533 "w_mbytes_per_sec": 0 00:11:08.533 }, 00:11:08.533 "claimed": true, 00:11:08.533 "claim_type": "exclusive_write", 00:11:08.533 "zoned": false, 00:11:08.533 "supported_io_types": { 00:11:08.533 "read": true, 00:11:08.533 "write": true, 00:11:08.533 "unmap": true, 00:11:08.533 "flush": true, 00:11:08.533 "reset": true, 00:11:08.533 "nvme_admin": false, 00:11:08.533 "nvme_io": false, 00:11:08.533 "nvme_io_md": false, 00:11:08.533 "write_zeroes": true, 00:11:08.533 "zcopy": true, 00:11:08.533 "get_zone_info": false, 00:11:08.533 "zone_management": false, 
00:11:08.533 "zone_append": false, 00:11:08.533 "compare": false, 00:11:08.533 "compare_and_write": false, 00:11:08.533 "abort": true, 00:11:08.533 "seek_hole": false, 00:11:08.533 "seek_data": false, 00:11:08.533 "copy": true, 00:11:08.533 "nvme_iov_md": false 00:11:08.533 }, 00:11:08.533 "memory_domains": [ 00:11:08.533 { 00:11:08.533 "dma_device_id": "system", 00:11:08.533 "dma_device_type": 1 00:11:08.533 }, 00:11:08.533 { 00:11:08.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.533 "dma_device_type": 2 00:11:08.533 } 00:11:08.533 ], 00:11:08.533 "driver_specific": {} 00:11:08.533 }' 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:08.533 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.791 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:08.792 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:08.792 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.792 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:08.792 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:08.792 10:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:09.049 [2024-07-12 10:37:44.116555] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:09.049 [2024-07-12 10:37:44.116585] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:09.049 [2024-07-12 10:37:44.116627] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.049 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.307 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.307 "name": "Existed_Raid", 00:11:09.307 "uuid": "394feffe-46c1-4fa1-b979-db05cecf5e7c", 00:11:09.307 "strip_size_kb": 64, 00:11:09.307 "state": "offline", 00:11:09.307 "raid_level": "raid0", 00:11:09.307 "superblock": true, 00:11:09.307 "num_base_bdevs": 2, 00:11:09.307 "num_base_bdevs_discovered": 1, 00:11:09.307 "num_base_bdevs_operational": 1, 00:11:09.307 "base_bdevs_list": [ 00:11:09.307 { 00:11:09.307 "name": null, 00:11:09.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.307 "is_configured": false, 00:11:09.307 "data_offset": 2048, 00:11:09.307 "data_size": 63488 00:11:09.307 }, 00:11:09.307 { 00:11:09.307 "name": "BaseBdev2", 00:11:09.307 "uuid": "9b2416b3-2853-4792-ac2b-67a8f4522acb", 00:11:09.307 "is_configured": true, 00:11:09.307 "data_offset": 2048, 00:11:09.307 "data_size": 63488 00:11:09.307 } 00:11:09.307 ] 00:11:09.307 }' 00:11:09.307 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.307 10:37:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:09.873 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:09.873 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:09.873 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.873 10:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:10.131 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:10.131 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:10.131 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:10.390 [2024-07-12 10:37:45.462015] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:10.390 [2024-07-12 10:37:45.462071] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e5000 name Existed_Raid, state offline 00:11:10.390 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:10.390 10:37:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:10.390 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.390 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2016206 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2016206 ']' 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2016206 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2016206 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2016206' 00:11:10.648 killing process with pid 2016206 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2016206 00:11:10.648 [2024-07-12 10:37:45.796369] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:10.648 10:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2016206 00:11:10.648 [2024-07-12 10:37:45.797339] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:10.907 10:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:10.907 00:11:10.907 real 0m10.602s 00:11:10.907 user 0m18.848s 00:11:10.907 sys 0m1.978s 00:11:10.907 10:37:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:10.907 10:37:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:10.907 ************************************ 00:11:10.907 END TEST raid_state_function_test_sb 00:11:10.907 ************************************ 00:11:10.907 10:37:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:10.907 10:37:46 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:10.907 10:37:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:10.907 10:37:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:10.907 10:37:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:11.166 ************************************ 00:11:11.166 START TEST raid_superblock_test 00:11:11.166 ************************************ 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:11.166 10:37:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2017842 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2017842 /var/tmp/spdk-raid.sock 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2017842 ']' 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:11.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:11.166 10:37:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.166 [2024-07-12 10:37:46.178417] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
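The raid_superblock_test flow exercised in the rest of this trace reduces to a similar sketch. Again, every RPC name, argument and UUID below is copied from the xtrace output that follows; the shell variable and the single-script framing are illustrative only.

  #!/usr/bin/env bash
  # Condensed sketch of the raid_superblock_test setup/teardown traced below.
  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Two malloc disks, each wrapped in a passthru bdev with a fixed UUID.
  $rpc bdev_malloc_create 32 512 -b malloc1
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_malloc_create 32 512 -b malloc2
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # Assemble a raid0 volume over the passthru bdevs: 64 KiB strip (-z 64), superblock (-s).
  $rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s

  # Inspect, then tear down (raid first, then the passthru bdevs), as the trace further on does.
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
  $rpc bdev_raid_delete raid_bdev1
  $rpc bdev_passthru_delete pt1
  $rpc bdev_passthru_delete pt2
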
00:11:11.166 [2024-07-12 10:37:46.178504] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2017842 ] 00:11:11.166 [2024-07-12 10:37:46.310776] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:11.425 [2024-07-12 10:37:46.416351] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:11.425 [2024-07-12 10:37:46.482053] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.425 [2024-07-12 10:37:46.482095] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:11.991 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:12.249 malloc1 00:11:12.249 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:12.508 [2024-07-12 10:37:47.592592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:12.508 [2024-07-12 10:37:47.592644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:12.508 [2024-07-12 10:37:47.592663] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2471570 00:11:12.508 [2024-07-12 10:37:47.592675] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.508 [2024-07-12 10:37:47.594232] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.508 [2024-07-12 10:37:47.594262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:12.508 pt1 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:12.508 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:12.766 malloc2 00:11:12.766 10:37:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:13.026 [2024-07-12 10:37:48.090705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:13.026 [2024-07-12 10:37:48.090752] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:13.026 [2024-07-12 10:37:48.090769] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2472970 00:11:13.026 [2024-07-12 10:37:48.090782] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:13.026 [2024-07-12 10:37:48.092208] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:13.026 [2024-07-12 10:37:48.092236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:13.026 pt2 00:11:13.026 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:13.026 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:13.026 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:13.284 [2024-07-12 10:37:48.335373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:13.284 [2024-07-12 10:37:48.336555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:13.284 [2024-07-12 10:37:48.336696] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2615270 00:11:13.284 [2024-07-12 10:37:48.336709] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:13.284 [2024-07-12 10:37:48.336888] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x260ac10 00:11:13.284 [2024-07-12 10:37:48.337026] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2615270 00:11:13.284 [2024-07-12 10:37:48.337036] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2615270 00:11:13.284 [2024-07-12 10:37:48.337126] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:13.284 10:37:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.284 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:13.542 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.542 "name": "raid_bdev1", 00:11:13.542 "uuid": "5ee02a68-33a5-489e-be91-afd5b970edf6", 00:11:13.542 "strip_size_kb": 64, 00:11:13.542 "state": "online", 00:11:13.542 "raid_level": "raid0", 00:11:13.542 "superblock": true, 00:11:13.542 "num_base_bdevs": 2, 00:11:13.542 "num_base_bdevs_discovered": 2, 00:11:13.542 "num_base_bdevs_operational": 2, 00:11:13.542 "base_bdevs_list": [ 00:11:13.542 { 00:11:13.542 "name": "pt1", 00:11:13.542 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:13.542 "is_configured": true, 00:11:13.542 "data_offset": 2048, 00:11:13.542 "data_size": 63488 00:11:13.542 }, 00:11:13.542 { 00:11:13.542 "name": "pt2", 00:11:13.542 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:13.542 "is_configured": true, 00:11:13.542 "data_offset": 2048, 00:11:13.542 "data_size": 63488 00:11:13.542 } 00:11:13.542 ] 00:11:13.542 }' 00:11:13.542 10:37:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.542 10:37:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:14.109 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:14.368 [2024-07-12 10:37:49.418454] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:14.368 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:14.368 "name": "raid_bdev1", 00:11:14.368 "aliases": [ 00:11:14.368 "5ee02a68-33a5-489e-be91-afd5b970edf6" 00:11:14.368 ], 00:11:14.368 "product_name": "Raid Volume", 00:11:14.368 "block_size": 512, 00:11:14.368 "num_blocks": 126976, 00:11:14.368 "uuid": 
"5ee02a68-33a5-489e-be91-afd5b970edf6", 00:11:14.368 "assigned_rate_limits": { 00:11:14.368 "rw_ios_per_sec": 0, 00:11:14.368 "rw_mbytes_per_sec": 0, 00:11:14.368 "r_mbytes_per_sec": 0, 00:11:14.368 "w_mbytes_per_sec": 0 00:11:14.368 }, 00:11:14.368 "claimed": false, 00:11:14.368 "zoned": false, 00:11:14.368 "supported_io_types": { 00:11:14.368 "read": true, 00:11:14.368 "write": true, 00:11:14.368 "unmap": true, 00:11:14.368 "flush": true, 00:11:14.368 "reset": true, 00:11:14.368 "nvme_admin": false, 00:11:14.368 "nvme_io": false, 00:11:14.368 "nvme_io_md": false, 00:11:14.368 "write_zeroes": true, 00:11:14.368 "zcopy": false, 00:11:14.368 "get_zone_info": false, 00:11:14.368 "zone_management": false, 00:11:14.368 "zone_append": false, 00:11:14.368 "compare": false, 00:11:14.368 "compare_and_write": false, 00:11:14.368 "abort": false, 00:11:14.368 "seek_hole": false, 00:11:14.368 "seek_data": false, 00:11:14.368 "copy": false, 00:11:14.368 "nvme_iov_md": false 00:11:14.368 }, 00:11:14.368 "memory_domains": [ 00:11:14.368 { 00:11:14.368 "dma_device_id": "system", 00:11:14.368 "dma_device_type": 1 00:11:14.368 }, 00:11:14.368 { 00:11:14.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.368 "dma_device_type": 2 00:11:14.368 }, 00:11:14.368 { 00:11:14.368 "dma_device_id": "system", 00:11:14.368 "dma_device_type": 1 00:11:14.368 }, 00:11:14.368 { 00:11:14.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.368 "dma_device_type": 2 00:11:14.368 } 00:11:14.368 ], 00:11:14.368 "driver_specific": { 00:11:14.368 "raid": { 00:11:14.368 "uuid": "5ee02a68-33a5-489e-be91-afd5b970edf6", 00:11:14.368 "strip_size_kb": 64, 00:11:14.368 "state": "online", 00:11:14.368 "raid_level": "raid0", 00:11:14.368 "superblock": true, 00:11:14.368 "num_base_bdevs": 2, 00:11:14.368 "num_base_bdevs_discovered": 2, 00:11:14.368 "num_base_bdevs_operational": 2, 00:11:14.368 "base_bdevs_list": [ 00:11:14.368 { 00:11:14.368 "name": "pt1", 00:11:14.368 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:14.368 "is_configured": true, 00:11:14.368 "data_offset": 2048, 00:11:14.368 "data_size": 63488 00:11:14.368 }, 00:11:14.368 { 00:11:14.368 "name": "pt2", 00:11:14.368 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:14.368 "is_configured": true, 00:11:14.368 "data_offset": 2048, 00:11:14.368 "data_size": 63488 00:11:14.368 } 00:11:14.368 ] 00:11:14.368 } 00:11:14.368 } 00:11:14.368 }' 00:11:14.368 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:14.368 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:14.369 pt2' 00:11:14.369 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:14.369 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:14.369 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:14.627 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:14.627 "name": "pt1", 00:11:14.627 "aliases": [ 00:11:14.627 "00000000-0000-0000-0000-000000000001" 00:11:14.627 ], 00:11:14.627 "product_name": "passthru", 00:11:14.627 "block_size": 512, 00:11:14.627 "num_blocks": 65536, 00:11:14.627 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:14.627 "assigned_rate_limits": { 00:11:14.627 
"rw_ios_per_sec": 0, 00:11:14.627 "rw_mbytes_per_sec": 0, 00:11:14.627 "r_mbytes_per_sec": 0, 00:11:14.627 "w_mbytes_per_sec": 0 00:11:14.627 }, 00:11:14.627 "claimed": true, 00:11:14.627 "claim_type": "exclusive_write", 00:11:14.627 "zoned": false, 00:11:14.627 "supported_io_types": { 00:11:14.627 "read": true, 00:11:14.627 "write": true, 00:11:14.627 "unmap": true, 00:11:14.627 "flush": true, 00:11:14.627 "reset": true, 00:11:14.627 "nvme_admin": false, 00:11:14.627 "nvme_io": false, 00:11:14.627 "nvme_io_md": false, 00:11:14.627 "write_zeroes": true, 00:11:14.627 "zcopy": true, 00:11:14.627 "get_zone_info": false, 00:11:14.627 "zone_management": false, 00:11:14.627 "zone_append": false, 00:11:14.627 "compare": false, 00:11:14.627 "compare_and_write": false, 00:11:14.627 "abort": true, 00:11:14.627 "seek_hole": false, 00:11:14.627 "seek_data": false, 00:11:14.627 "copy": true, 00:11:14.627 "nvme_iov_md": false 00:11:14.627 }, 00:11:14.627 "memory_domains": [ 00:11:14.627 { 00:11:14.627 "dma_device_id": "system", 00:11:14.627 "dma_device_type": 1 00:11:14.627 }, 00:11:14.627 { 00:11:14.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.627 "dma_device_type": 2 00:11:14.627 } 00:11:14.627 ], 00:11:14.627 "driver_specific": { 00:11:14.627 "passthru": { 00:11:14.627 "name": "pt1", 00:11:14.627 "base_bdev_name": "malloc1" 00:11:14.627 } 00:11:14.627 } 00:11:14.627 }' 00:11:14.627 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.627 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.886 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:14.886 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.886 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.886 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:14.886 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.886 10:37:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.886 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:14.886 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.886 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.144 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:15.144 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:15.144 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:15.145 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:15.145 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:15.145 "name": "pt2", 00:11:15.145 "aliases": [ 00:11:15.145 "00000000-0000-0000-0000-000000000002" 00:11:15.145 ], 00:11:15.145 "product_name": "passthru", 00:11:15.145 "block_size": 512, 00:11:15.145 "num_blocks": 65536, 00:11:15.145 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:15.145 "assigned_rate_limits": { 00:11:15.145 "rw_ios_per_sec": 0, 00:11:15.145 "rw_mbytes_per_sec": 0, 00:11:15.145 "r_mbytes_per_sec": 0, 00:11:15.145 "w_mbytes_per_sec": 0 
00:11:15.145 }, 00:11:15.145 "claimed": true, 00:11:15.145 "claim_type": "exclusive_write", 00:11:15.145 "zoned": false, 00:11:15.145 "supported_io_types": { 00:11:15.145 "read": true, 00:11:15.145 "write": true, 00:11:15.145 "unmap": true, 00:11:15.145 "flush": true, 00:11:15.145 "reset": true, 00:11:15.145 "nvme_admin": false, 00:11:15.145 "nvme_io": false, 00:11:15.145 "nvme_io_md": false, 00:11:15.145 "write_zeroes": true, 00:11:15.145 "zcopy": true, 00:11:15.145 "get_zone_info": false, 00:11:15.145 "zone_management": false, 00:11:15.145 "zone_append": false, 00:11:15.145 "compare": false, 00:11:15.145 "compare_and_write": false, 00:11:15.145 "abort": true, 00:11:15.145 "seek_hole": false, 00:11:15.145 "seek_data": false, 00:11:15.145 "copy": true, 00:11:15.145 "nvme_iov_md": false 00:11:15.145 }, 00:11:15.145 "memory_domains": [ 00:11:15.145 { 00:11:15.145 "dma_device_id": "system", 00:11:15.145 "dma_device_type": 1 00:11:15.145 }, 00:11:15.145 { 00:11:15.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.145 "dma_device_type": 2 00:11:15.145 } 00:11:15.145 ], 00:11:15.145 "driver_specific": { 00:11:15.145 "passthru": { 00:11:15.145 "name": "pt2", 00:11:15.145 "base_bdev_name": "malloc2" 00:11:15.145 } 00:11:15.145 } 00:11:15.145 }' 00:11:15.145 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.145 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:15.403 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.661 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.661 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:15.661 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:15.661 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:15.969 [2024-07-12 10:37:50.866288] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:15.969 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5ee02a68-33a5-489e-be91-afd5b970edf6 00:11:15.969 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 5ee02a68-33a5-489e-be91-afd5b970edf6 ']' 00:11:15.969 10:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:15.969 [2024-07-12 10:37:51.114729] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:15.969 [2024-07-12 10:37:51.114755] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:15.969 [2024-07-12 10:37:51.114816] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:15.970 [2024-07-12 10:37:51.114861] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:15.970 [2024-07-12 10:37:51.114873] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2615270 name raid_bdev1, state offline 00:11:15.970 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.970 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:16.234 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:16.234 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:16.234 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:16.234 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:16.491 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:16.491 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:16.749 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:16.749 10:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:17.008 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:17.267 [2024-07-12 10:37:52.337914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:17.267 [2024-07-12 10:37:52.339309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:17.267 [2024-07-12 10:37:52.339367] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:17.267 [2024-07-12 10:37:52.339408] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:17.267 [2024-07-12 10:37:52.339427] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:17.267 [2024-07-12 10:37:52.339436] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2614ff0 name raid_bdev1, state configuring 00:11:17.267 request: 00:11:17.267 { 00:11:17.267 "name": "raid_bdev1", 00:11:17.267 "raid_level": "raid0", 00:11:17.267 "base_bdevs": [ 00:11:17.267 "malloc1", 00:11:17.267 "malloc2" 00:11:17.267 ], 00:11:17.267 "strip_size_kb": 64, 00:11:17.267 "superblock": false, 00:11:17.267 "method": "bdev_raid_create", 00:11:17.267 "req_id": 1 00:11:17.267 } 00:11:17.267 Got JSON-RPC error response 00:11:17.267 response: 00:11:17.267 { 00:11:17.267 "code": -17, 00:11:17.267 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:17.267 } 00:11:17.267 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:17.267 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:17.267 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:17.267 10:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:17.267 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.267 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:17.525 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:17.525 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:17.525 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:17.784 [2024-07-12 10:37:52.815097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:17.784 [2024-07-12 10:37:52.815142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:17.784 [2024-07-12 10:37:52.815165] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24717a0 00:11:17.784 [2024-07-12 10:37:52.815178] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:17.784 [2024-07-12 10:37:52.816775] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:17.784 [2024-07-12 10:37:52.816805] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:17.784 [2024-07-12 10:37:52.816872] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:17.784 [2024-07-12 10:37:52.816897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:17.784 pt1 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.784 10:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:18.050 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.050 "name": "raid_bdev1", 00:11:18.050 "uuid": "5ee02a68-33a5-489e-be91-afd5b970edf6", 00:11:18.050 "strip_size_kb": 64, 00:11:18.050 "state": "configuring", 00:11:18.050 "raid_level": "raid0", 00:11:18.050 "superblock": true, 00:11:18.050 "num_base_bdevs": 2, 00:11:18.050 "num_base_bdevs_discovered": 1, 00:11:18.050 "num_base_bdevs_operational": 2, 00:11:18.050 "base_bdevs_list": [ 00:11:18.050 { 00:11:18.050 "name": "pt1", 00:11:18.050 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:18.050 "is_configured": true, 00:11:18.050 "data_offset": 2048, 00:11:18.050 "data_size": 63488 00:11:18.050 }, 00:11:18.050 { 00:11:18.050 "name": null, 00:11:18.050 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:18.050 "is_configured": false, 00:11:18.050 "data_offset": 2048, 00:11:18.050 "data_size": 63488 00:11:18.050 } 00:11:18.050 ] 00:11:18.050 }' 00:11:18.050 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.050 10:37:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:18.617 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:18.617 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:18.617 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:18.617 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:18.876 [2024-07-12 10:37:53.922044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:18.876 [2024-07-12 10:37:53.922096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:18.876 [2024-07-12 10:37:53.922121] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x260b820 00:11:18.876 [2024-07-12 10:37:53.922134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:18.876 [2024-07-12 10:37:53.922498] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:18.876 [2024-07-12 10:37:53.922520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:18.876 [2024-07-12 10:37:53.922584] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:18.876 [2024-07-12 10:37:53.922603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:18.876 [2024-07-12 10:37:53.922700] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2467ec0 00:11:18.876 [2024-07-12 10:37:53.922711] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:18.876 [2024-07-12 10:37:53.922880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x246a530 00:11:18.876 [2024-07-12 10:37:53.923000] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2467ec0 00:11:18.876 [2024-07-12 10:37:53.923010] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2467ec0 00:11:18.876 [2024-07-12 10:37:53.923107] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:18.876 pt2 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.876 10:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:19.134 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:19.134 "name": "raid_bdev1", 00:11:19.134 "uuid": "5ee02a68-33a5-489e-be91-afd5b970edf6", 00:11:19.134 "strip_size_kb": 64, 00:11:19.134 "state": "online", 00:11:19.134 "raid_level": "raid0", 00:11:19.134 "superblock": true, 00:11:19.134 "num_base_bdevs": 2, 00:11:19.134 "num_base_bdevs_discovered": 2, 00:11:19.134 "num_base_bdevs_operational": 2, 00:11:19.134 "base_bdevs_list": [ 00:11:19.134 { 00:11:19.134 "name": "pt1", 00:11:19.134 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:19.134 "is_configured": true, 00:11:19.134 "data_offset": 2048, 00:11:19.134 "data_size": 63488 00:11:19.134 }, 00:11:19.134 { 00:11:19.134 "name": "pt2", 00:11:19.134 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:19.134 "is_configured": true, 00:11:19.134 "data_offset": 2048, 00:11:19.134 "data_size": 63488 00:11:19.134 } 00:11:19.134 ] 00:11:19.134 }' 00:11:19.134 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:19.134 10:37:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:19.699 10:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:19.957 [2024-07-12 10:37:55.029291] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:19.957 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:19.957 "name": "raid_bdev1", 00:11:19.957 "aliases": [ 00:11:19.957 "5ee02a68-33a5-489e-be91-afd5b970edf6" 00:11:19.957 ], 00:11:19.957 "product_name": "Raid Volume", 00:11:19.957 "block_size": 512, 00:11:19.957 "num_blocks": 126976, 00:11:19.957 "uuid": "5ee02a68-33a5-489e-be91-afd5b970edf6", 00:11:19.957 "assigned_rate_limits": { 00:11:19.957 "rw_ios_per_sec": 0, 00:11:19.957 "rw_mbytes_per_sec": 0, 00:11:19.957 "r_mbytes_per_sec": 0, 00:11:19.957 "w_mbytes_per_sec": 0 00:11:19.957 }, 00:11:19.957 "claimed": false, 00:11:19.957 "zoned": false, 00:11:19.957 "supported_io_types": { 00:11:19.957 "read": true, 00:11:19.957 "write": true, 00:11:19.957 "unmap": true, 00:11:19.957 "flush": true, 00:11:19.957 "reset": true, 00:11:19.957 "nvme_admin": false, 00:11:19.957 "nvme_io": false, 00:11:19.957 "nvme_io_md": false, 00:11:19.957 "write_zeroes": true, 00:11:19.957 "zcopy": false, 00:11:19.957 "get_zone_info": false, 00:11:19.957 "zone_management": false, 00:11:19.957 "zone_append": false, 00:11:19.957 "compare": false, 00:11:19.958 "compare_and_write": false, 00:11:19.958 "abort": false, 00:11:19.958 "seek_hole": false, 00:11:19.958 "seek_data": false, 00:11:19.958 "copy": false, 00:11:19.958 "nvme_iov_md": false 00:11:19.958 }, 00:11:19.958 "memory_domains": [ 00:11:19.958 { 00:11:19.958 "dma_device_id": 
"system", 00:11:19.958 "dma_device_type": 1 00:11:19.958 }, 00:11:19.958 { 00:11:19.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.958 "dma_device_type": 2 00:11:19.958 }, 00:11:19.958 { 00:11:19.958 "dma_device_id": "system", 00:11:19.958 "dma_device_type": 1 00:11:19.958 }, 00:11:19.958 { 00:11:19.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.958 "dma_device_type": 2 00:11:19.958 } 00:11:19.958 ], 00:11:19.958 "driver_specific": { 00:11:19.958 "raid": { 00:11:19.958 "uuid": "5ee02a68-33a5-489e-be91-afd5b970edf6", 00:11:19.958 "strip_size_kb": 64, 00:11:19.958 "state": "online", 00:11:19.958 "raid_level": "raid0", 00:11:19.958 "superblock": true, 00:11:19.958 "num_base_bdevs": 2, 00:11:19.958 "num_base_bdevs_discovered": 2, 00:11:19.958 "num_base_bdevs_operational": 2, 00:11:19.958 "base_bdevs_list": [ 00:11:19.958 { 00:11:19.958 "name": "pt1", 00:11:19.958 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:19.958 "is_configured": true, 00:11:19.958 "data_offset": 2048, 00:11:19.958 "data_size": 63488 00:11:19.958 }, 00:11:19.958 { 00:11:19.958 "name": "pt2", 00:11:19.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:19.958 "is_configured": true, 00:11:19.958 "data_offset": 2048, 00:11:19.958 "data_size": 63488 00:11:19.958 } 00:11:19.958 ] 00:11:19.958 } 00:11:19.958 } 00:11:19.958 }' 00:11:19.958 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:19.958 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:19.958 pt2' 00:11:19.958 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:19.958 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:19.958 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:20.216 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:20.216 "name": "pt1", 00:11:20.216 "aliases": [ 00:11:20.216 "00000000-0000-0000-0000-000000000001" 00:11:20.216 ], 00:11:20.216 "product_name": "passthru", 00:11:20.216 "block_size": 512, 00:11:20.216 "num_blocks": 65536, 00:11:20.216 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:20.216 "assigned_rate_limits": { 00:11:20.216 "rw_ios_per_sec": 0, 00:11:20.216 "rw_mbytes_per_sec": 0, 00:11:20.216 "r_mbytes_per_sec": 0, 00:11:20.216 "w_mbytes_per_sec": 0 00:11:20.216 }, 00:11:20.216 "claimed": true, 00:11:20.216 "claim_type": "exclusive_write", 00:11:20.216 "zoned": false, 00:11:20.216 "supported_io_types": { 00:11:20.216 "read": true, 00:11:20.216 "write": true, 00:11:20.216 "unmap": true, 00:11:20.216 "flush": true, 00:11:20.216 "reset": true, 00:11:20.216 "nvme_admin": false, 00:11:20.216 "nvme_io": false, 00:11:20.216 "nvme_io_md": false, 00:11:20.216 "write_zeroes": true, 00:11:20.216 "zcopy": true, 00:11:20.216 "get_zone_info": false, 00:11:20.216 "zone_management": false, 00:11:20.216 "zone_append": false, 00:11:20.216 "compare": false, 00:11:20.216 "compare_and_write": false, 00:11:20.216 "abort": true, 00:11:20.216 "seek_hole": false, 00:11:20.216 "seek_data": false, 00:11:20.216 "copy": true, 00:11:20.216 "nvme_iov_md": false 00:11:20.216 }, 00:11:20.216 "memory_domains": [ 00:11:20.216 { 00:11:20.216 "dma_device_id": "system", 00:11:20.216 "dma_device_type": 1 00:11:20.216 }, 
00:11:20.216 { 00:11:20.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.216 "dma_device_type": 2 00:11:20.216 } 00:11:20.216 ], 00:11:20.216 "driver_specific": { 00:11:20.216 "passthru": { 00:11:20.216 "name": "pt1", 00:11:20.216 "base_bdev_name": "malloc1" 00:11:20.216 } 00:11:20.216 } 00:11:20.216 }' 00:11:20.216 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:20.216 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:20.474 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:20.732 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:20.732 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:20.732 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:20.732 "name": "pt2", 00:11:20.732 "aliases": [ 00:11:20.732 "00000000-0000-0000-0000-000000000002" 00:11:20.732 ], 00:11:20.732 "product_name": "passthru", 00:11:20.732 "block_size": 512, 00:11:20.732 "num_blocks": 65536, 00:11:20.732 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:20.732 "assigned_rate_limits": { 00:11:20.732 "rw_ios_per_sec": 0, 00:11:20.732 "rw_mbytes_per_sec": 0, 00:11:20.732 "r_mbytes_per_sec": 0, 00:11:20.732 "w_mbytes_per_sec": 0 00:11:20.732 }, 00:11:20.732 "claimed": true, 00:11:20.732 "claim_type": "exclusive_write", 00:11:20.732 "zoned": false, 00:11:20.732 "supported_io_types": { 00:11:20.732 "read": true, 00:11:20.732 "write": true, 00:11:20.732 "unmap": true, 00:11:20.732 "flush": true, 00:11:20.732 "reset": true, 00:11:20.732 "nvme_admin": false, 00:11:20.732 "nvme_io": false, 00:11:20.732 "nvme_io_md": false, 00:11:20.732 "write_zeroes": true, 00:11:20.732 "zcopy": true, 00:11:20.732 "get_zone_info": false, 00:11:20.732 "zone_management": false, 00:11:20.732 "zone_append": false, 00:11:20.732 "compare": false, 00:11:20.732 "compare_and_write": false, 00:11:20.732 "abort": true, 00:11:20.732 "seek_hole": false, 00:11:20.732 "seek_data": false, 00:11:20.732 "copy": true, 00:11:20.732 "nvme_iov_md": false 00:11:20.732 }, 00:11:20.732 "memory_domains": [ 00:11:20.732 { 00:11:20.732 "dma_device_id": "system", 00:11:20.732 "dma_device_type": 1 00:11:20.732 }, 00:11:20.732 { 00:11:20.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.732 "dma_device_type": 2 00:11:20.732 } 00:11:20.732 ], 
00:11:20.732 "driver_specific": { 00:11:20.732 "passthru": { 00:11:20.732 "name": "pt2", 00:11:20.732 "base_bdev_name": "malloc2" 00:11:20.732 } 00:11:20.732 } 00:11:20.732 }' 00:11:20.732 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:20.990 10:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:20.990 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.248 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.248 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:21.248 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:21.507 [2024-07-12 10:37:56.457112] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 5ee02a68-33a5-489e-be91-afd5b970edf6 '!=' 5ee02a68-33a5-489e-be91-afd5b970edf6 ']' 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2017842 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2017842 ']' 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2017842 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2017842 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2017842' 00:11:21.507 killing process with pid 2017842 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2017842 00:11:21.507 [2024-07-12 10:37:56.530179] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:21.507 [2024-07-12 
10:37:56.530241] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:21.507 [2024-07-12 10:37:56.530283] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:21.507 [2024-07-12 10:37:56.530305] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2467ec0 name raid_bdev1, state offline 00:11:21.507 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2017842 00:11:21.507 [2024-07-12 10:37:56.548705] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:21.766 10:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:21.766 00:11:21.766 real 0m10.662s 00:11:21.766 user 0m19.007s 00:11:21.766 sys 0m2.009s 00:11:21.766 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:21.766 10:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.766 ************************************ 00:11:21.766 END TEST raid_superblock_test 00:11:21.766 ************************************ 00:11:21.766 10:37:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:21.766 10:37:56 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:21.766 10:37:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:21.766 10:37:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:21.766 10:37:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:21.766 ************************************ 00:11:21.766 START TEST raid_read_error_test 00:11:21.766 ************************************ 00:11:21.766 10:37:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:11:21.766 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:21.766 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:21.766 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:21.767 10:37:56 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.k7IXSsvbMT 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2019476 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2019476 /var/tmp/spdk-raid.sock 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2019476 ']' 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:21.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:21.767 10:37:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.767 [2024-07-12 10:37:56.940033] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
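For readability, the RPC sequence this bdevperf instance is driven with (visible piecemeal in the trace that follows) roughly amounts to the sketch below. The socket path, bdev names, RPC method names and the raid0 / -z 64 arguments are copied from this trace; the loop is a condensation for illustration, not the literal test script.

# Condensed sketch of the raid_read_error_test RPC flow seen in this trace.
# Assumes bdevperf is already listening on /var/tmp/spdk-raid.sock (started above).
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

for bdev in BaseBdev1 BaseBdev2; do
    $rpc bdev_malloc_create 32 512 -b ${bdev}_malloc        # backing malloc bdev
    $rpc bdev_error_create ${bdev}_malloc                   # error-injection bdev, named EE_${bdev}_malloc in this trace
    $rpc bdev_passthru_create -b EE_${bdev}_malloc -p $bdev # passthru used as the raid base bdev
done

# Assemble the raid0 volume with a superblock (-s) and 64 KiB strip size.
$rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s

# Inject read failures into the first base bdev while bdevperf runs I/O.
$rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure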
00:11:21.767 [2024-07-12 10:37:56.940108] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2019476 ] 00:11:22.025 [2024-07-12 10:37:57.070268] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:22.025 [2024-07-12 10:37:57.172062] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:22.283 [2024-07-12 10:37:57.240429] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:22.283 [2024-07-12 10:37:57.240467] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:22.848 10:37:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:22.848 10:37:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:22.848 10:37:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:22.848 10:37:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:22.848 BaseBdev1_malloc 00:11:23.106 10:37:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:23.106 true 00:11:23.106 10:37:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:23.364 [2024-07-12 10:37:58.522497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:23.364 [2024-07-12 10:37:58.522546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:23.364 [2024-07-12 10:37:58.522566] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28620d0 00:11:23.364 [2024-07-12 10:37:58.522578] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:23.364 [2024-07-12 10:37:58.524319] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:23.364 [2024-07-12 10:37:58.524352] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:23.364 BaseBdev1 00:11:23.364 10:37:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:23.364 10:37:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:23.622 BaseBdev2_malloc 00:11:23.622 10:37:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:23.880 true 00:11:23.880 10:37:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:24.139 [2024-07-12 10:37:59.124725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:24.139 [2024-07-12 10:37:59.124770] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:24.139 [2024-07-12 10:37:59.124790] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2866910 00:11:24.139 [2024-07-12 10:37:59.124803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:24.139 [2024-07-12 10:37:59.126211] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:24.139 [2024-07-12 10:37:59.126241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:24.139 BaseBdev2 00:11:24.139 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:24.399 [2024-07-12 10:37:59.369404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:24.399 [2024-07-12 10:37:59.370687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:24.399 [2024-07-12 10:37:59.370876] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2868320 00:11:24.399 [2024-07-12 10:37:59.370890] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:24.399 [2024-07-12 10:37:59.371087] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2867270 00:11:24.399 [2024-07-12 10:37:59.371230] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2868320 00:11:24.399 [2024-07-12 10:37:59.371241] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2868320 00:11:24.399 [2024-07-12 10:37:59.371343] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.399 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:24.658 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:24.658 "name": "raid_bdev1", 00:11:24.658 "uuid": "8355f019-50e2-407f-92fb-c6623fbec650", 00:11:24.658 "strip_size_kb": 64, 00:11:24.658 "state": "online", 00:11:24.658 "raid_level": "raid0", 
00:11:24.658 "superblock": true, 00:11:24.658 "num_base_bdevs": 2, 00:11:24.658 "num_base_bdevs_discovered": 2, 00:11:24.658 "num_base_bdevs_operational": 2, 00:11:24.658 "base_bdevs_list": [ 00:11:24.658 { 00:11:24.658 "name": "BaseBdev1", 00:11:24.658 "uuid": "d8f810f9-5f25-533a-a42a-c05e3f124510", 00:11:24.658 "is_configured": true, 00:11:24.658 "data_offset": 2048, 00:11:24.658 "data_size": 63488 00:11:24.658 }, 00:11:24.658 { 00:11:24.658 "name": "BaseBdev2", 00:11:24.658 "uuid": "5fe72eff-32a8-52dd-b770-a9582170c11a", 00:11:24.658 "is_configured": true, 00:11:24.658 "data_offset": 2048, 00:11:24.658 "data_size": 63488 00:11:24.658 } 00:11:24.658 ] 00:11:24.658 }' 00:11:24.658 10:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:24.658 10:37:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:25.224 10:38:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:25.224 10:38:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:25.224 [2024-07-12 10:38:00.352310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28639b0 00:11:26.164 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.429 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:26.687 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.687 "name": "raid_bdev1", 00:11:26.687 "uuid": "8355f019-50e2-407f-92fb-c6623fbec650", 00:11:26.687 "strip_size_kb": 64, 00:11:26.687 "state": "online", 00:11:26.687 
"raid_level": "raid0", 00:11:26.687 "superblock": true, 00:11:26.687 "num_base_bdevs": 2, 00:11:26.687 "num_base_bdevs_discovered": 2, 00:11:26.687 "num_base_bdevs_operational": 2, 00:11:26.687 "base_bdevs_list": [ 00:11:26.687 { 00:11:26.687 "name": "BaseBdev1", 00:11:26.687 "uuid": "d8f810f9-5f25-533a-a42a-c05e3f124510", 00:11:26.687 "is_configured": true, 00:11:26.687 "data_offset": 2048, 00:11:26.687 "data_size": 63488 00:11:26.687 }, 00:11:26.687 { 00:11:26.687 "name": "BaseBdev2", 00:11:26.687 "uuid": "5fe72eff-32a8-52dd-b770-a9582170c11a", 00:11:26.687 "is_configured": true, 00:11:26.687 "data_offset": 2048, 00:11:26.687 "data_size": 63488 00:11:26.687 } 00:11:26.687 ] 00:11:26.687 }' 00:11:26.687 10:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.687 10:38:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.252 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:27.510 [2024-07-12 10:38:02.586687] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:27.510 [2024-07-12 10:38:02.586729] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:27.510 [2024-07-12 10:38:02.589900] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:27.510 [2024-07-12 10:38:02.589933] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:27.510 [2024-07-12 10:38:02.589961] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:27.510 [2024-07-12 10:38:02.589972] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2868320 name raid_bdev1, state offline 00:11:27.510 0 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2019476 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2019476 ']' 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2019476 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2019476 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2019476' 00:11:27.510 killing process with pid 2019476 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2019476 00:11:27.510 [2024-07-12 10:38:02.653839] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:27.510 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2019476 00:11:27.510 [2024-07-12 10:38:02.664375] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.k7IXSsvbMT 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:27.768 00:11:27.768 real 0m6.040s 00:11:27.768 user 0m9.367s 00:11:27.768 sys 0m1.077s 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:27.768 10:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.768 ************************************ 00:11:27.768 END TEST raid_read_error_test 00:11:27.768 ************************************ 00:11:27.768 10:38:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:27.768 10:38:02 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:11:27.768 10:38:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:27.768 10:38:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:27.768 10:38:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:28.026 ************************************ 00:11:28.026 START TEST raid_write_error_test 00:11:28.026 ************************************ 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:28.026 10:38:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.XUn87M5yR1 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2020446 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2020446 /var/tmp/spdk-raid.sock 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2020446 ']' 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:28.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:28.026 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:28.026 [2024-07-12 10:38:03.067188] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
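The write variant launched here mirrors the read flow sketched above; only the injected I/O type and the bdevperf log file change. A hedged sketch of those two differing steps, reusing this run's log path from the line above; the write-side injection and the final check are not visible in this excerpt and are inferred from the read-side commands at bdev_raid.sh@827 and @843-@847.

# Inferred write-side counterpart of bdev_raid.sh@827 (not shown in this excerpt).
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_error_inject_error EE_BaseBdev1_malloc write failure

# Mirrors the @843-@847 check from the read test: raid0 has no redundancy, so the
# injected errors must surface as a nonzero failure rate in this run's bdevperf log.
fail_per_s=$(grep -v Job /raidtest/tmp.XUn87M5yR1 | grep raid_bdev1 | awk '{print $6}')
[[ "$fail_per_s" != "0.00" ]]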
00:11:28.026 [2024-07-12 10:38:03.067262] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2020446 ] 00:11:28.026 [2024-07-12 10:38:03.199638] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:28.284 [2024-07-12 10:38:03.303233] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:28.284 [2024-07-12 10:38:03.369368] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:28.284 [2024-07-12 10:38:03.369408] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:28.849 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:28.849 10:38:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:28.849 10:38:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:28.849 10:38:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:29.106 BaseBdev1_malloc 00:11:29.106 10:38:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:29.365 true 00:11:29.365 10:38:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:29.624 [2024-07-12 10:38:04.643891] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:29.624 [2024-07-12 10:38:04.643938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:29.624 [2024-07-12 10:38:04.643958] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd10d0 00:11:29.624 [2024-07-12 10:38:04.643971] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:29.624 [2024-07-12 10:38:04.645706] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:29.624 [2024-07-12 10:38:04.645737] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:29.624 BaseBdev1 00:11:29.624 10:38:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:29.624 10:38:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:29.882 BaseBdev2_malloc 00:11:29.882 10:38:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:30.154 true 00:11:30.154 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:30.425 [2024-07-12 10:38:05.358315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:30.425 [2024-07-12 10:38:05.358363] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:30.425 [2024-07-12 10:38:05.358383] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd5910 00:11:30.425 [2024-07-12 10:38:05.358396] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:30.425 [2024-07-12 10:38:05.359803] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:30.425 [2024-07-12 10:38:05.359832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:30.425 BaseBdev2 00:11:30.425 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:30.425 [2024-07-12 10:38:05.594983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:30.425 [2024-07-12 10:38:05.596278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:30.425 [2024-07-12 10:38:05.596466] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcd7320 00:11:30.425 [2024-07-12 10:38:05.596479] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:30.425 [2024-07-12 10:38:05.596686] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd6270 00:11:30.425 [2024-07-12 10:38:05.596832] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcd7320 00:11:30.425 [2024-07-12 10:38:05.596842] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcd7320 00:11:30.426 [2024-07-12 10:38:05.596943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.426 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.684 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.684 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:30.684 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:30.684 "name": "raid_bdev1", 00:11:30.684 "uuid": "0bc3e262-a09f-4587-a051-525cf85cefb4", 00:11:30.684 "strip_size_kb": 64, 00:11:30.684 "state": "online", 00:11:30.684 "raid_level": 
"raid0", 00:11:30.684 "superblock": true, 00:11:30.684 "num_base_bdevs": 2, 00:11:30.684 "num_base_bdevs_discovered": 2, 00:11:30.684 "num_base_bdevs_operational": 2, 00:11:30.684 "base_bdevs_list": [ 00:11:30.684 { 00:11:30.684 "name": "BaseBdev1", 00:11:30.684 "uuid": "728a8f51-81e8-5ab2-90b2-b38d0ced767d", 00:11:30.684 "is_configured": true, 00:11:30.684 "data_offset": 2048, 00:11:30.684 "data_size": 63488 00:11:30.684 }, 00:11:30.684 { 00:11:30.684 "name": "BaseBdev2", 00:11:30.684 "uuid": "6520f9bd-1113-5dda-a884-5cca184d13f4", 00:11:30.684 "is_configured": true, 00:11:30.684 "data_offset": 2048, 00:11:30.684 "data_size": 63488 00:11:30.684 } 00:11:30.684 ] 00:11:30.684 }' 00:11:30.684 10:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:30.684 10:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.620 10:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:31.620 10:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:31.620 [2024-07-12 10:38:06.561822] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd29b0 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:32.555 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:32.556 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:32.556 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:32.556 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.556 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:32.813 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.813 "name": "raid_bdev1", 00:11:32.813 "uuid": "0bc3e262-a09f-4587-a051-525cf85cefb4", 00:11:32.813 "strip_size_kb": 64, 00:11:32.813 
"state": "online", 00:11:32.813 "raid_level": "raid0", 00:11:32.813 "superblock": true, 00:11:32.813 "num_base_bdevs": 2, 00:11:32.813 "num_base_bdevs_discovered": 2, 00:11:32.813 "num_base_bdevs_operational": 2, 00:11:32.813 "base_bdevs_list": [ 00:11:32.813 { 00:11:32.813 "name": "BaseBdev1", 00:11:32.813 "uuid": "728a8f51-81e8-5ab2-90b2-b38d0ced767d", 00:11:32.813 "is_configured": true, 00:11:32.813 "data_offset": 2048, 00:11:32.813 "data_size": 63488 00:11:32.813 }, 00:11:32.813 { 00:11:32.813 "name": "BaseBdev2", 00:11:32.813 "uuid": "6520f9bd-1113-5dda-a884-5cca184d13f4", 00:11:32.813 "is_configured": true, 00:11:32.813 "data_offset": 2048, 00:11:32.813 "data_size": 63488 00:11:32.813 } 00:11:32.813 ] 00:11:32.813 }' 00:11:32.813 10:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.813 10:38:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.378 10:38:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:33.637 [2024-07-12 10:38:08.774821] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:33.637 [2024-07-12 10:38:08.774863] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:33.637 [2024-07-12 10:38:08.778037] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:33.637 [2024-07-12 10:38:08.778075] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:33.637 [2024-07-12 10:38:08.778103] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:33.637 [2024-07-12 10:38:08.778114] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd7320 name raid_bdev1, state offline 00:11:33.637 0 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2020446 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2020446 ']' 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2020446 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2020446 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2020446' 00:11:33.637 killing process with pid 2020446 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2020446 00:11:33.637 [2024-07-12 10:38:08.825320] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:33.637 10:38:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2020446 00:11:33.895 [2024-07-12 10:38:08.837499] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.XUn87M5yR1 00:11:33.895 
10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:33.895 00:11:33.895 real 0m6.089s 00:11:33.895 user 0m9.488s 00:11:33.895 sys 0m1.060s 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:33.895 10:38:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.895 ************************************ 00:11:33.895 END TEST raid_write_error_test 00:11:33.895 ************************************ 00:11:34.154 10:38:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:34.154 10:38:09 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:34.154 10:38:09 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:34.154 10:38:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:34.154 10:38:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:34.154 10:38:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:34.154 ************************************ 00:11:34.154 START TEST raid_state_function_test 00:11:34.154 ************************************ 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 
-- # local raid_bdev_name=Existed_Raid 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2021259 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2021259' 00:11:34.154 Process raid pid: 2021259 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2021259 /var/tmp/spdk-raid.sock 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2021259 ']' 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:34.154 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:34.154 10:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.154 [2024-07-12 10:38:09.219933] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
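The raid_state_function_test that starts here drives the same RPC socket from the lightweight bdev_svc app instead of bdevperf, and walks a concat array through its configuring/online/offline states. The verification idiom repeated throughout the trace below is to dump all raid bdevs over RPC, select the array under test with jq, and compare fields such as .state, .num_base_bdevs_discovered and .base_bdevs_list. A minimal sketch of that flow, with the socket path, names and sizes taken from the trace (order condensed; the trace also creates and deletes the array a couple of times before both base bdevs exist):

  # Create the concat array; while BaseBdev1/BaseBdev2 are missing it reports state "configuring"
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat \
      -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

  # Check used after every step: fetch all raid bdevs and pick out Existed_Raid
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid")'

  # As the base bdevs appear they are claimed by the array; once both exist it goes "online"
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2

  # concat has no redundancy, so deleting a base bdev drops the array to "offline"
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1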
00:11:34.154 [2024-07-12 10:38:09.219997] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:34.154 [2024-07-12 10:38:09.341310] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:34.412 [2024-07-12 10:38:09.445831] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:34.412 [2024-07-12 10:38:09.514273] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:34.412 [2024-07-12 10:38:09.514308] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:34.978 10:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:34.978 10:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:34.978 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:35.545 [2024-07-12 10:38:10.634553] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:35.545 [2024-07-12 10:38:10.634598] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:35.545 [2024-07-12 10:38:10.634609] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:35.545 [2024-07-12 10:38:10.634621] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.545 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:35.803 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:35.803 "name": "Existed_Raid", 00:11:35.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.803 "strip_size_kb": 64, 00:11:35.803 "state": "configuring", 00:11:35.803 "raid_level": "concat", 00:11:35.803 "superblock": false, 
00:11:35.803 "num_base_bdevs": 2, 00:11:35.803 "num_base_bdevs_discovered": 0, 00:11:35.803 "num_base_bdevs_operational": 2, 00:11:35.803 "base_bdevs_list": [ 00:11:35.803 { 00:11:35.803 "name": "BaseBdev1", 00:11:35.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.803 "is_configured": false, 00:11:35.803 "data_offset": 0, 00:11:35.804 "data_size": 0 00:11:35.804 }, 00:11:35.804 { 00:11:35.804 "name": "BaseBdev2", 00:11:35.804 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.804 "is_configured": false, 00:11:35.804 "data_offset": 0, 00:11:35.804 "data_size": 0 00:11:35.804 } 00:11:35.804 ] 00:11:35.804 }' 00:11:35.804 10:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:35.804 10:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.370 10:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:36.629 [2024-07-12 10:38:11.713270] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:36.629 [2024-07-12 10:38:11.713303] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101da80 name Existed_Raid, state configuring 00:11:36.629 10:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:36.887 [2024-07-12 10:38:11.957931] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:36.887 [2024-07-12 10:38:11.957961] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:36.887 [2024-07-12 10:38:11.957971] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:36.887 [2024-07-12 10:38:11.957983] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:36.887 10:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:37.145 [2024-07-12 10:38:12.208573] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:37.145 BaseBdev1 00:11:37.145 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:37.145 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:37.145 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:37.145 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:37.145 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:37.145 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:37.145 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:37.404 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:11:37.664 [ 00:11:37.664 { 00:11:37.664 "name": "BaseBdev1", 00:11:37.664 "aliases": [ 00:11:37.664 "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868" 00:11:37.664 ], 00:11:37.664 "product_name": "Malloc disk", 00:11:37.664 "block_size": 512, 00:11:37.664 "num_blocks": 65536, 00:11:37.664 "uuid": "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868", 00:11:37.664 "assigned_rate_limits": { 00:11:37.664 "rw_ios_per_sec": 0, 00:11:37.664 "rw_mbytes_per_sec": 0, 00:11:37.664 "r_mbytes_per_sec": 0, 00:11:37.664 "w_mbytes_per_sec": 0 00:11:37.664 }, 00:11:37.664 "claimed": true, 00:11:37.664 "claim_type": "exclusive_write", 00:11:37.664 "zoned": false, 00:11:37.664 "supported_io_types": { 00:11:37.664 "read": true, 00:11:37.664 "write": true, 00:11:37.664 "unmap": true, 00:11:37.664 "flush": true, 00:11:37.664 "reset": true, 00:11:37.664 "nvme_admin": false, 00:11:37.664 "nvme_io": false, 00:11:37.664 "nvme_io_md": false, 00:11:37.664 "write_zeroes": true, 00:11:37.664 "zcopy": true, 00:11:37.664 "get_zone_info": false, 00:11:37.664 "zone_management": false, 00:11:37.664 "zone_append": false, 00:11:37.664 "compare": false, 00:11:37.664 "compare_and_write": false, 00:11:37.664 "abort": true, 00:11:37.664 "seek_hole": false, 00:11:37.664 "seek_data": false, 00:11:37.664 "copy": true, 00:11:37.664 "nvme_iov_md": false 00:11:37.664 }, 00:11:37.664 "memory_domains": [ 00:11:37.664 { 00:11:37.664 "dma_device_id": "system", 00:11:37.664 "dma_device_type": 1 00:11:37.664 }, 00:11:37.664 { 00:11:37.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.664 "dma_device_type": 2 00:11:37.664 } 00:11:37.664 ], 00:11:37.664 "driver_specific": {} 00:11:37.664 } 00:11:37.664 ] 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.664 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.923 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.923 "name": "Existed_Raid", 00:11:37.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.923 "strip_size_kb": 64, 00:11:37.923 "state": "configuring", 00:11:37.923 
"raid_level": "concat", 00:11:37.923 "superblock": false, 00:11:37.923 "num_base_bdevs": 2, 00:11:37.923 "num_base_bdevs_discovered": 1, 00:11:37.923 "num_base_bdevs_operational": 2, 00:11:37.923 "base_bdevs_list": [ 00:11:37.923 { 00:11:37.923 "name": "BaseBdev1", 00:11:37.923 "uuid": "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868", 00:11:37.923 "is_configured": true, 00:11:37.923 "data_offset": 0, 00:11:37.923 "data_size": 65536 00:11:37.923 }, 00:11:37.923 { 00:11:37.923 "name": "BaseBdev2", 00:11:37.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.923 "is_configured": false, 00:11:37.923 "data_offset": 0, 00:11:37.923 "data_size": 0 00:11:37.923 } 00:11:37.923 ] 00:11:37.923 }' 00:11:37.923 10:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.923 10:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.488 10:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:38.744 [2024-07-12 10:38:13.832976] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:38.744 [2024-07-12 10:38:13.833013] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101d350 name Existed_Raid, state configuring 00:11:38.744 10:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:39.001 [2024-07-12 10:38:14.081658] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:39.001 [2024-07-12 10:38:14.083137] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:39.001 [2024-07-12 10:38:14.083169] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.001 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:39.258 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.258 "name": "Existed_Raid", 00:11:39.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.258 "strip_size_kb": 64, 00:11:39.258 "state": "configuring", 00:11:39.258 "raid_level": "concat", 00:11:39.258 "superblock": false, 00:11:39.258 "num_base_bdevs": 2, 00:11:39.258 "num_base_bdevs_discovered": 1, 00:11:39.258 "num_base_bdevs_operational": 2, 00:11:39.258 "base_bdevs_list": [ 00:11:39.258 { 00:11:39.258 "name": "BaseBdev1", 00:11:39.258 "uuid": "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868", 00:11:39.258 "is_configured": true, 00:11:39.258 "data_offset": 0, 00:11:39.258 "data_size": 65536 00:11:39.258 }, 00:11:39.258 { 00:11:39.258 "name": "BaseBdev2", 00:11:39.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.258 "is_configured": false, 00:11:39.258 "data_offset": 0, 00:11:39.258 "data_size": 0 00:11:39.258 } 00:11:39.258 ] 00:11:39.258 }' 00:11:39.258 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.258 10:38:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.823 10:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:40.081 [2024-07-12 10:38:15.103656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:40.081 [2024-07-12 10:38:15.103695] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101e000 00:11:40.081 [2024-07-12 10:38:15.103704] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:40.081 [2024-07-12 10:38:15.103892] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf380c0 00:11:40.081 [2024-07-12 10:38:15.104010] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101e000 00:11:40.081 [2024-07-12 10:38:15.104021] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x101e000 00:11:40.081 [2024-07-12 10:38:15.104175] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:40.081 BaseBdev2 00:11:40.081 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:40.081 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:40.081 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:40.081 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:40.081 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:40.081 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:40.081 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:40.337 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:40.337 [ 00:11:40.337 { 00:11:40.337 "name": "BaseBdev2", 00:11:40.337 "aliases": [ 00:11:40.337 "ea4cbd96-a909-4330-add3-bd098254f9ba" 00:11:40.337 ], 00:11:40.337 "product_name": "Malloc disk", 00:11:40.337 "block_size": 512, 00:11:40.337 "num_blocks": 65536, 00:11:40.337 "uuid": "ea4cbd96-a909-4330-add3-bd098254f9ba", 00:11:40.337 "assigned_rate_limits": { 00:11:40.337 "rw_ios_per_sec": 0, 00:11:40.337 "rw_mbytes_per_sec": 0, 00:11:40.337 "r_mbytes_per_sec": 0, 00:11:40.337 "w_mbytes_per_sec": 0 00:11:40.337 }, 00:11:40.337 "claimed": true, 00:11:40.337 "claim_type": "exclusive_write", 00:11:40.337 "zoned": false, 00:11:40.337 "supported_io_types": { 00:11:40.337 "read": true, 00:11:40.337 "write": true, 00:11:40.337 "unmap": true, 00:11:40.337 "flush": true, 00:11:40.337 "reset": true, 00:11:40.337 "nvme_admin": false, 00:11:40.337 "nvme_io": false, 00:11:40.337 "nvme_io_md": false, 00:11:40.337 "write_zeroes": true, 00:11:40.337 "zcopy": true, 00:11:40.337 "get_zone_info": false, 00:11:40.337 "zone_management": false, 00:11:40.337 "zone_append": false, 00:11:40.337 "compare": false, 00:11:40.337 "compare_and_write": false, 00:11:40.337 "abort": true, 00:11:40.337 "seek_hole": false, 00:11:40.337 "seek_data": false, 00:11:40.337 "copy": true, 00:11:40.337 "nvme_iov_md": false 00:11:40.337 }, 00:11:40.337 "memory_domains": [ 00:11:40.337 { 00:11:40.337 "dma_device_id": "system", 00:11:40.337 "dma_device_type": 1 00:11:40.337 }, 00:11:40.337 { 00:11:40.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.337 "dma_device_type": 2 00:11:40.337 } 00:11:40.337 ], 00:11:40.337 "driver_specific": {} 00:11:40.337 } 00:11:40.337 ] 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:40.594 "name": "Existed_Raid", 00:11:40.594 "uuid": "21a36b3b-56cd-46f3-ab93-252be488f273", 00:11:40.594 "strip_size_kb": 64, 00:11:40.594 "state": "online", 00:11:40.594 "raid_level": "concat", 00:11:40.594 "superblock": false, 00:11:40.594 "num_base_bdevs": 2, 00:11:40.594 "num_base_bdevs_discovered": 2, 00:11:40.594 "num_base_bdevs_operational": 2, 00:11:40.594 "base_bdevs_list": [ 00:11:40.594 { 00:11:40.594 "name": "BaseBdev1", 00:11:40.594 "uuid": "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868", 00:11:40.594 "is_configured": true, 00:11:40.594 "data_offset": 0, 00:11:40.594 "data_size": 65536 00:11:40.594 }, 00:11:40.594 { 00:11:40.594 "name": "BaseBdev2", 00:11:40.594 "uuid": "ea4cbd96-a909-4330-add3-bd098254f9ba", 00:11:40.594 "is_configured": true, 00:11:40.594 "data_offset": 0, 00:11:40.594 "data_size": 65536 00:11:40.594 } 00:11:40.594 ] 00:11:40.594 }' 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:40.594 10:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:41.159 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:41.417 [2024-07-12 10:38:16.475569] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:41.418 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:41.418 "name": "Existed_Raid", 00:11:41.418 "aliases": [ 00:11:41.418 "21a36b3b-56cd-46f3-ab93-252be488f273" 00:11:41.418 ], 00:11:41.418 "product_name": "Raid Volume", 00:11:41.418 "block_size": 512, 00:11:41.418 "num_blocks": 131072, 00:11:41.418 "uuid": "21a36b3b-56cd-46f3-ab93-252be488f273", 00:11:41.418 "assigned_rate_limits": { 00:11:41.418 "rw_ios_per_sec": 0, 00:11:41.418 "rw_mbytes_per_sec": 0, 00:11:41.418 "r_mbytes_per_sec": 0, 00:11:41.418 "w_mbytes_per_sec": 0 00:11:41.418 }, 00:11:41.418 "claimed": false, 00:11:41.418 "zoned": false, 00:11:41.418 "supported_io_types": { 00:11:41.418 "read": true, 00:11:41.418 "write": true, 00:11:41.418 "unmap": true, 00:11:41.418 "flush": true, 00:11:41.418 "reset": true, 00:11:41.418 "nvme_admin": false, 00:11:41.418 "nvme_io": false, 00:11:41.418 "nvme_io_md": false, 00:11:41.418 "write_zeroes": true, 00:11:41.418 "zcopy": false, 00:11:41.418 "get_zone_info": false, 00:11:41.418 "zone_management": false, 00:11:41.418 "zone_append": false, 00:11:41.418 "compare": false, 00:11:41.418 "compare_and_write": false, 00:11:41.418 "abort": false, 00:11:41.418 "seek_hole": false, 00:11:41.418 "seek_data": false, 00:11:41.418 "copy": false, 00:11:41.418 
"nvme_iov_md": false 00:11:41.418 }, 00:11:41.418 "memory_domains": [ 00:11:41.418 { 00:11:41.418 "dma_device_id": "system", 00:11:41.418 "dma_device_type": 1 00:11:41.418 }, 00:11:41.418 { 00:11:41.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.418 "dma_device_type": 2 00:11:41.418 }, 00:11:41.418 { 00:11:41.418 "dma_device_id": "system", 00:11:41.418 "dma_device_type": 1 00:11:41.418 }, 00:11:41.418 { 00:11:41.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.418 "dma_device_type": 2 00:11:41.418 } 00:11:41.418 ], 00:11:41.418 "driver_specific": { 00:11:41.418 "raid": { 00:11:41.418 "uuid": "21a36b3b-56cd-46f3-ab93-252be488f273", 00:11:41.418 "strip_size_kb": 64, 00:11:41.418 "state": "online", 00:11:41.418 "raid_level": "concat", 00:11:41.418 "superblock": false, 00:11:41.418 "num_base_bdevs": 2, 00:11:41.418 "num_base_bdevs_discovered": 2, 00:11:41.418 "num_base_bdevs_operational": 2, 00:11:41.418 "base_bdevs_list": [ 00:11:41.418 { 00:11:41.418 "name": "BaseBdev1", 00:11:41.418 "uuid": "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868", 00:11:41.418 "is_configured": true, 00:11:41.418 "data_offset": 0, 00:11:41.418 "data_size": 65536 00:11:41.418 }, 00:11:41.418 { 00:11:41.418 "name": "BaseBdev2", 00:11:41.418 "uuid": "ea4cbd96-a909-4330-add3-bd098254f9ba", 00:11:41.418 "is_configured": true, 00:11:41.418 "data_offset": 0, 00:11:41.418 "data_size": 65536 00:11:41.418 } 00:11:41.418 ] 00:11:41.418 } 00:11:41.418 } 00:11:41.418 }' 00:11:41.418 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:41.418 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:41.418 BaseBdev2' 00:11:41.418 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:41.418 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:41.418 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:41.676 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:41.676 "name": "BaseBdev1", 00:11:41.676 "aliases": [ 00:11:41.676 "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868" 00:11:41.676 ], 00:11:41.676 "product_name": "Malloc disk", 00:11:41.676 "block_size": 512, 00:11:41.676 "num_blocks": 65536, 00:11:41.676 "uuid": "bf083a4f-71c1-4f0d-a4f2-c5c130ec5868", 00:11:41.676 "assigned_rate_limits": { 00:11:41.676 "rw_ios_per_sec": 0, 00:11:41.676 "rw_mbytes_per_sec": 0, 00:11:41.676 "r_mbytes_per_sec": 0, 00:11:41.676 "w_mbytes_per_sec": 0 00:11:41.676 }, 00:11:41.676 "claimed": true, 00:11:41.676 "claim_type": "exclusive_write", 00:11:41.676 "zoned": false, 00:11:41.676 "supported_io_types": { 00:11:41.676 "read": true, 00:11:41.676 "write": true, 00:11:41.676 "unmap": true, 00:11:41.676 "flush": true, 00:11:41.676 "reset": true, 00:11:41.676 "nvme_admin": false, 00:11:41.676 "nvme_io": false, 00:11:41.676 "nvme_io_md": false, 00:11:41.676 "write_zeroes": true, 00:11:41.676 "zcopy": true, 00:11:41.676 "get_zone_info": false, 00:11:41.676 "zone_management": false, 00:11:41.676 "zone_append": false, 00:11:41.676 "compare": false, 00:11:41.676 "compare_and_write": false, 00:11:41.676 "abort": true, 00:11:41.676 "seek_hole": false, 00:11:41.676 "seek_data": false, 00:11:41.676 "copy": true, 00:11:41.676 
"nvme_iov_md": false 00:11:41.676 }, 00:11:41.676 "memory_domains": [ 00:11:41.676 { 00:11:41.676 "dma_device_id": "system", 00:11:41.676 "dma_device_type": 1 00:11:41.676 }, 00:11:41.676 { 00:11:41.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.676 "dma_device_type": 2 00:11:41.676 } 00:11:41.676 ], 00:11:41.676 "driver_specific": {} 00:11:41.676 }' 00:11:41.676 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.676 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.933 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:41.933 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.933 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.933 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:41.933 10:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.933 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.933 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:41.933 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.933 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.190 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:42.190 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:42.190 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:42.190 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:42.448 "name": "BaseBdev2", 00:11:42.448 "aliases": [ 00:11:42.448 "ea4cbd96-a909-4330-add3-bd098254f9ba" 00:11:42.448 ], 00:11:42.448 "product_name": "Malloc disk", 00:11:42.448 "block_size": 512, 00:11:42.448 "num_blocks": 65536, 00:11:42.448 "uuid": "ea4cbd96-a909-4330-add3-bd098254f9ba", 00:11:42.448 "assigned_rate_limits": { 00:11:42.448 "rw_ios_per_sec": 0, 00:11:42.448 "rw_mbytes_per_sec": 0, 00:11:42.448 "r_mbytes_per_sec": 0, 00:11:42.448 "w_mbytes_per_sec": 0 00:11:42.448 }, 00:11:42.448 "claimed": true, 00:11:42.448 "claim_type": "exclusive_write", 00:11:42.448 "zoned": false, 00:11:42.448 "supported_io_types": { 00:11:42.448 "read": true, 00:11:42.448 "write": true, 00:11:42.448 "unmap": true, 00:11:42.448 "flush": true, 00:11:42.448 "reset": true, 00:11:42.448 "nvme_admin": false, 00:11:42.448 "nvme_io": false, 00:11:42.448 "nvme_io_md": false, 00:11:42.448 "write_zeroes": true, 00:11:42.448 "zcopy": true, 00:11:42.448 "get_zone_info": false, 00:11:42.448 "zone_management": false, 00:11:42.448 "zone_append": false, 00:11:42.448 "compare": false, 00:11:42.448 "compare_and_write": false, 00:11:42.448 "abort": true, 00:11:42.448 "seek_hole": false, 00:11:42.448 "seek_data": false, 00:11:42.448 "copy": true, 00:11:42.448 "nvme_iov_md": false 00:11:42.448 }, 00:11:42.448 "memory_domains": [ 00:11:42.448 { 00:11:42.448 "dma_device_id": "system", 00:11:42.448 "dma_device_type": 1 00:11:42.448 }, 
00:11:42.448 { 00:11:42.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.448 "dma_device_type": 2 00:11:42.448 } 00:11:42.448 ], 00:11:42.448 "driver_specific": {} 00:11:42.448 }' 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:42.448 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.709 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.709 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:42.709 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:42.967 [2024-07-12 10:38:17.935215] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:42.968 [2024-07-12 10:38:17.935242] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:42.968 [2024-07-12 10:38:17.935281] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.968 10:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:43.226 10:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:43.226 "name": "Existed_Raid", 00:11:43.226 "uuid": "21a36b3b-56cd-46f3-ab93-252be488f273", 00:11:43.226 "strip_size_kb": 64, 00:11:43.226 "state": "offline", 00:11:43.226 "raid_level": "concat", 00:11:43.226 "superblock": false, 00:11:43.226 "num_base_bdevs": 2, 00:11:43.226 "num_base_bdevs_discovered": 1, 00:11:43.226 "num_base_bdevs_operational": 1, 00:11:43.226 "base_bdevs_list": [ 00:11:43.226 { 00:11:43.226 "name": null, 00:11:43.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.226 "is_configured": false, 00:11:43.226 "data_offset": 0, 00:11:43.226 "data_size": 65536 00:11:43.226 }, 00:11:43.226 { 00:11:43.226 "name": "BaseBdev2", 00:11:43.226 "uuid": "ea4cbd96-a909-4330-add3-bd098254f9ba", 00:11:43.226 "is_configured": true, 00:11:43.226 "data_offset": 0, 00:11:43.226 "data_size": 65536 00:11:43.226 } 00:11:43.226 ] 00:11:43.226 }' 00:11:43.226 10:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:43.226 10:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.793 10:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:43.793 10:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:43.793 10:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.793 10:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:44.050 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:44.050 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:44.050 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:44.308 [2024-07-12 10:38:19.300749] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:44.308 [2024-07-12 10:38:19.300796] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101e000 name Existed_Raid, state offline 00:11:44.308 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:44.308 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:44.308 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.308 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:44.598 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:44.598 10:38:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:44.598 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:44.598 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2021259 00:11:44.598 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2021259 ']' 00:11:44.598 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2021259 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2021259 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2021259' 00:11:44.599 killing process with pid 2021259 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2021259 00:11:44.599 [2024-07-12 10:38:19.634726] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:44.599 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2021259 00:11:44.599 [2024-07-12 10:38:19.635620] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:44.860 10:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:44.860 00:11:44.860 real 0m10.707s 00:11:44.860 user 0m19.043s 00:11:44.860 sys 0m1.951s 00:11:44.860 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:44.860 10:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.860 ************************************ 00:11:44.860 END TEST raid_state_function_test 00:11:44.860 ************************************ 00:11:44.860 10:38:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:44.860 10:38:19 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:44.860 10:38:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:44.860 10:38:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:44.860 10:38:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:44.860 ************************************ 00:11:44.860 START TEST raid_state_function_test_sb 00:11:44.860 ************************************ 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:44.861 10:38:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2022892 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2022892' 00:11:44.861 Process raid pid: 2022892 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2022892 /var/tmp/spdk-raid.sock 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2022892 ']' 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:44.861 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:44.861 10:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:44.861 [2024-07-12 10:38:19.998054] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:11:44.861 [2024-07-12 10:38:19.998114] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:45.119 [2024-07-12 10:38:20.130726] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:45.119 [2024-07-12 10:38:20.239774] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:45.119 [2024-07-12 10:38:20.302095] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.119 [2024-07-12 10:38:20.302122] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.377 10:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:45.377 10:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:45.377 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:45.637 [2024-07-12 10:38:20.699472] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:45.637 [2024-07-12 10:38:20.699519] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:45.637 [2024-07-12 10:38:20.699530] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:45.637 [2024-07-12 10:38:20.699542] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.637 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:45.896 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.896 "name": "Existed_Raid", 00:11:45.896 "uuid": "e4678c72-1d3c-4ab3-8277-5c0b39de6965", 00:11:45.896 "strip_size_kb": 64, 00:11:45.896 "state": "configuring", 00:11:45.896 "raid_level": "concat", 00:11:45.896 "superblock": true, 00:11:45.896 "num_base_bdevs": 2, 00:11:45.896 "num_base_bdevs_discovered": 0, 00:11:45.896 "num_base_bdevs_operational": 2, 00:11:45.896 "base_bdevs_list": [ 00:11:45.896 { 00:11:45.896 "name": "BaseBdev1", 00:11:45.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.896 "is_configured": false, 00:11:45.896 "data_offset": 0, 00:11:45.896 "data_size": 0 00:11:45.896 }, 00:11:45.896 { 00:11:45.896 "name": "BaseBdev2", 00:11:45.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.896 "is_configured": false, 00:11:45.896 "data_offset": 0, 00:11:45.896 "data_size": 0 00:11:45.896 } 00:11:45.896 ] 00:11:45.896 }' 00:11:45.896 10:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.896 10:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:46.461 10:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:46.718 [2024-07-12 10:38:21.722042] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:46.718 [2024-07-12 10:38:21.722072] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x253aa80 name Existed_Raid, state configuring 00:11:46.718 10:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:46.975 [2024-07-12 10:38:21.970716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:46.975 [2024-07-12 10:38:21.970743] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:46.975 [2024-07-12 10:38:21.970752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:46.975 [2024-07-12 10:38:21.970764] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:46.975 10:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:46.975 [2024-07-12 10:38:22.161070] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:46.975 BaseBdev1 00:11:47.231 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:47.231 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:47.231 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:47.231 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:47.231 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:47.231 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:47.231 
10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:47.231 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:47.488 [ 00:11:47.488 { 00:11:47.488 "name": "BaseBdev1", 00:11:47.488 "aliases": [ 00:11:47.488 "1c42b704-7679-4a26-ab4c-6abb7167a4ab" 00:11:47.488 ], 00:11:47.488 "product_name": "Malloc disk", 00:11:47.488 "block_size": 512, 00:11:47.488 "num_blocks": 65536, 00:11:47.488 "uuid": "1c42b704-7679-4a26-ab4c-6abb7167a4ab", 00:11:47.488 "assigned_rate_limits": { 00:11:47.488 "rw_ios_per_sec": 0, 00:11:47.488 "rw_mbytes_per_sec": 0, 00:11:47.488 "r_mbytes_per_sec": 0, 00:11:47.488 "w_mbytes_per_sec": 0 00:11:47.488 }, 00:11:47.488 "claimed": true, 00:11:47.488 "claim_type": "exclusive_write", 00:11:47.488 "zoned": false, 00:11:47.488 "supported_io_types": { 00:11:47.488 "read": true, 00:11:47.488 "write": true, 00:11:47.488 "unmap": true, 00:11:47.488 "flush": true, 00:11:47.488 "reset": true, 00:11:47.488 "nvme_admin": false, 00:11:47.488 "nvme_io": false, 00:11:47.488 "nvme_io_md": false, 00:11:47.488 "write_zeroes": true, 00:11:47.488 "zcopy": true, 00:11:47.488 "get_zone_info": false, 00:11:47.488 "zone_management": false, 00:11:47.488 "zone_append": false, 00:11:47.488 "compare": false, 00:11:47.488 "compare_and_write": false, 00:11:47.488 "abort": true, 00:11:47.488 "seek_hole": false, 00:11:47.488 "seek_data": false, 00:11:47.488 "copy": true, 00:11:47.488 "nvme_iov_md": false 00:11:47.488 }, 00:11:47.488 "memory_domains": [ 00:11:47.488 { 00:11:47.488 "dma_device_id": "system", 00:11:47.488 "dma_device_type": 1 00:11:47.488 }, 00:11:47.488 { 00:11:47.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.488 "dma_device_type": 2 00:11:47.488 } 00:11:47.488 ], 00:11:47.488 "driver_specific": {} 00:11:47.488 } 00:11:47.488 ] 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.488 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.745 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.745 "name": "Existed_Raid", 00:11:47.745 "uuid": "0c246015-501f-4b83-af6d-c04dfd563036", 00:11:47.745 "strip_size_kb": 64, 00:11:47.745 "state": "configuring", 00:11:47.745 "raid_level": "concat", 00:11:47.745 "superblock": true, 00:11:47.745 "num_base_bdevs": 2, 00:11:47.745 "num_base_bdevs_discovered": 1, 00:11:47.745 "num_base_bdevs_operational": 2, 00:11:47.745 "base_bdevs_list": [ 00:11:47.745 { 00:11:47.745 "name": "BaseBdev1", 00:11:47.745 "uuid": "1c42b704-7679-4a26-ab4c-6abb7167a4ab", 00:11:47.745 "is_configured": true, 00:11:47.745 "data_offset": 2048, 00:11:47.745 "data_size": 63488 00:11:47.745 }, 00:11:47.745 { 00:11:47.745 "name": "BaseBdev2", 00:11:47.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.745 "is_configured": false, 00:11:47.745 "data_offset": 0, 00:11:47.745 "data_size": 0 00:11:47.745 } 00:11:47.745 ] 00:11:47.745 }' 00:11:47.745 10:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.745 10:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:48.310 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:48.568 [2024-07-12 10:38:23.520692] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:48.568 [2024-07-12 10:38:23.520732] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x253a350 name Existed_Raid, state configuring 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:48.568 [2024-07-12 10:38:23.697199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:48.568 [2024-07-12 10:38:23.698722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:48.568 [2024-07-12 10:38:23.698757] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.568 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:48.825 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.825 "name": "Existed_Raid", 00:11:48.825 "uuid": "be00063f-24b3-466c-8c78-ad72f1ef8f3b", 00:11:48.825 "strip_size_kb": 64, 00:11:48.825 "state": "configuring", 00:11:48.825 "raid_level": "concat", 00:11:48.825 "superblock": true, 00:11:48.825 "num_base_bdevs": 2, 00:11:48.825 "num_base_bdevs_discovered": 1, 00:11:48.825 "num_base_bdevs_operational": 2, 00:11:48.825 "base_bdevs_list": [ 00:11:48.825 { 00:11:48.825 "name": "BaseBdev1", 00:11:48.825 "uuid": "1c42b704-7679-4a26-ab4c-6abb7167a4ab", 00:11:48.825 "is_configured": true, 00:11:48.825 "data_offset": 2048, 00:11:48.825 "data_size": 63488 00:11:48.825 }, 00:11:48.825 { 00:11:48.825 "name": "BaseBdev2", 00:11:48.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.825 "is_configured": false, 00:11:48.825 "data_offset": 0, 00:11:48.825 "data_size": 0 00:11:48.825 } 00:11:48.826 ] 00:11:48.826 }' 00:11:48.826 10:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.826 10:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.392 10:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:49.651 [2024-07-12 10:38:24.660227] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:49.651 [2024-07-12 10:38:24.660375] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x253b000 00:11:49.651 [2024-07-12 10:38:24.660389] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:49.651 [2024-07-12 10:38:24.660563] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24550c0 00:11:49.651 [2024-07-12 10:38:24.660676] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x253b000 00:11:49.651 [2024-07-12 10:38:24.660687] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x253b000 00:11:49.651 [2024-07-12 10:38:24.660774] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:49.651 BaseBdev2 00:11:49.651 10:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:49.651 10:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:49.651 10:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:49.651 10:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:49.651 10:38:24 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:49.651 10:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:49.651 10:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:49.909 10:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:50.168 [ 00:11:50.168 { 00:11:50.168 "name": "BaseBdev2", 00:11:50.168 "aliases": [ 00:11:50.168 "6d4c5333-7f60-4922-a585-e4eb423bdf9a" 00:11:50.168 ], 00:11:50.168 "product_name": "Malloc disk", 00:11:50.168 "block_size": 512, 00:11:50.168 "num_blocks": 65536, 00:11:50.168 "uuid": "6d4c5333-7f60-4922-a585-e4eb423bdf9a", 00:11:50.168 "assigned_rate_limits": { 00:11:50.168 "rw_ios_per_sec": 0, 00:11:50.168 "rw_mbytes_per_sec": 0, 00:11:50.168 "r_mbytes_per_sec": 0, 00:11:50.168 "w_mbytes_per_sec": 0 00:11:50.168 }, 00:11:50.168 "claimed": true, 00:11:50.168 "claim_type": "exclusive_write", 00:11:50.168 "zoned": false, 00:11:50.168 "supported_io_types": { 00:11:50.168 "read": true, 00:11:50.168 "write": true, 00:11:50.168 "unmap": true, 00:11:50.168 "flush": true, 00:11:50.168 "reset": true, 00:11:50.168 "nvme_admin": false, 00:11:50.168 "nvme_io": false, 00:11:50.168 "nvme_io_md": false, 00:11:50.168 "write_zeroes": true, 00:11:50.168 "zcopy": true, 00:11:50.168 "get_zone_info": false, 00:11:50.168 "zone_management": false, 00:11:50.168 "zone_append": false, 00:11:50.168 "compare": false, 00:11:50.168 "compare_and_write": false, 00:11:50.168 "abort": true, 00:11:50.168 "seek_hole": false, 00:11:50.168 "seek_data": false, 00:11:50.168 "copy": true, 00:11:50.168 "nvme_iov_md": false 00:11:50.168 }, 00:11:50.168 "memory_domains": [ 00:11:50.168 { 00:11:50.168 "dma_device_id": "system", 00:11:50.168 "dma_device_type": 1 00:11:50.168 }, 00:11:50.168 { 00:11:50.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.168 "dma_device_type": 2 00:11:50.168 } 00:11:50.168 ], 00:11:50.168 "driver_specific": {} 00:11:50.168 } 00:11:50.168 ] 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.168 
10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.168 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.428 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.428 "name": "Existed_Raid", 00:11:50.428 "uuid": "be00063f-24b3-466c-8c78-ad72f1ef8f3b", 00:11:50.428 "strip_size_kb": 64, 00:11:50.428 "state": "online", 00:11:50.428 "raid_level": "concat", 00:11:50.428 "superblock": true, 00:11:50.428 "num_base_bdevs": 2, 00:11:50.428 "num_base_bdevs_discovered": 2, 00:11:50.428 "num_base_bdevs_operational": 2, 00:11:50.428 "base_bdevs_list": [ 00:11:50.428 { 00:11:50.428 "name": "BaseBdev1", 00:11:50.428 "uuid": "1c42b704-7679-4a26-ab4c-6abb7167a4ab", 00:11:50.428 "is_configured": true, 00:11:50.428 "data_offset": 2048, 00:11:50.428 "data_size": 63488 00:11:50.428 }, 00:11:50.428 { 00:11:50.428 "name": "BaseBdev2", 00:11:50.428 "uuid": "6d4c5333-7f60-4922-a585-e4eb423bdf9a", 00:11:50.428 "is_configured": true, 00:11:50.428 "data_offset": 2048, 00:11:50.428 "data_size": 63488 00:11:50.428 } 00:11:50.428 ] 00:11:50.428 }' 00:11:50.428 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.428 10:38:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.995 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:50.995 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:50.995 10:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:50.995 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:50.995 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:50.995 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:50.995 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:50.995 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:51.254 [2024-07-12 10:38:26.228657] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:51.254 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:51.254 "name": "Existed_Raid", 00:11:51.254 "aliases": [ 00:11:51.254 "be00063f-24b3-466c-8c78-ad72f1ef8f3b" 00:11:51.254 ], 00:11:51.254 "product_name": "Raid Volume", 00:11:51.254 "block_size": 512, 00:11:51.254 "num_blocks": 126976, 00:11:51.254 "uuid": "be00063f-24b3-466c-8c78-ad72f1ef8f3b", 00:11:51.254 "assigned_rate_limits": { 00:11:51.254 "rw_ios_per_sec": 0, 00:11:51.254 "rw_mbytes_per_sec": 0, 00:11:51.254 "r_mbytes_per_sec": 0, 00:11:51.254 "w_mbytes_per_sec": 0 00:11:51.254 }, 00:11:51.254 "claimed": false, 00:11:51.254 "zoned": false, 00:11:51.254 
"supported_io_types": { 00:11:51.254 "read": true, 00:11:51.254 "write": true, 00:11:51.254 "unmap": true, 00:11:51.254 "flush": true, 00:11:51.254 "reset": true, 00:11:51.254 "nvme_admin": false, 00:11:51.254 "nvme_io": false, 00:11:51.254 "nvme_io_md": false, 00:11:51.254 "write_zeroes": true, 00:11:51.254 "zcopy": false, 00:11:51.254 "get_zone_info": false, 00:11:51.254 "zone_management": false, 00:11:51.254 "zone_append": false, 00:11:51.254 "compare": false, 00:11:51.254 "compare_and_write": false, 00:11:51.254 "abort": false, 00:11:51.254 "seek_hole": false, 00:11:51.254 "seek_data": false, 00:11:51.254 "copy": false, 00:11:51.254 "nvme_iov_md": false 00:11:51.254 }, 00:11:51.254 "memory_domains": [ 00:11:51.254 { 00:11:51.254 "dma_device_id": "system", 00:11:51.254 "dma_device_type": 1 00:11:51.254 }, 00:11:51.254 { 00:11:51.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.254 "dma_device_type": 2 00:11:51.254 }, 00:11:51.254 { 00:11:51.254 "dma_device_id": "system", 00:11:51.254 "dma_device_type": 1 00:11:51.254 }, 00:11:51.254 { 00:11:51.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.254 "dma_device_type": 2 00:11:51.254 } 00:11:51.254 ], 00:11:51.254 "driver_specific": { 00:11:51.254 "raid": { 00:11:51.254 "uuid": "be00063f-24b3-466c-8c78-ad72f1ef8f3b", 00:11:51.254 "strip_size_kb": 64, 00:11:51.254 "state": "online", 00:11:51.254 "raid_level": "concat", 00:11:51.254 "superblock": true, 00:11:51.254 "num_base_bdevs": 2, 00:11:51.254 "num_base_bdevs_discovered": 2, 00:11:51.254 "num_base_bdevs_operational": 2, 00:11:51.254 "base_bdevs_list": [ 00:11:51.254 { 00:11:51.254 "name": "BaseBdev1", 00:11:51.254 "uuid": "1c42b704-7679-4a26-ab4c-6abb7167a4ab", 00:11:51.254 "is_configured": true, 00:11:51.254 "data_offset": 2048, 00:11:51.254 "data_size": 63488 00:11:51.254 }, 00:11:51.254 { 00:11:51.254 "name": "BaseBdev2", 00:11:51.254 "uuid": "6d4c5333-7f60-4922-a585-e4eb423bdf9a", 00:11:51.254 "is_configured": true, 00:11:51.254 "data_offset": 2048, 00:11:51.254 "data_size": 63488 00:11:51.254 } 00:11:51.254 ] 00:11:51.254 } 00:11:51.254 } 00:11:51.254 }' 00:11:51.254 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:51.254 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:51.254 BaseBdev2' 00:11:51.254 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.254 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:51.254 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.513 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.513 "name": "BaseBdev1", 00:11:51.513 "aliases": [ 00:11:51.513 "1c42b704-7679-4a26-ab4c-6abb7167a4ab" 00:11:51.513 ], 00:11:51.513 "product_name": "Malloc disk", 00:11:51.513 "block_size": 512, 00:11:51.513 "num_blocks": 65536, 00:11:51.513 "uuid": "1c42b704-7679-4a26-ab4c-6abb7167a4ab", 00:11:51.513 "assigned_rate_limits": { 00:11:51.513 "rw_ios_per_sec": 0, 00:11:51.513 "rw_mbytes_per_sec": 0, 00:11:51.513 "r_mbytes_per_sec": 0, 00:11:51.513 "w_mbytes_per_sec": 0 00:11:51.513 }, 00:11:51.513 "claimed": true, 00:11:51.513 "claim_type": "exclusive_write", 00:11:51.513 "zoned": 
false, 00:11:51.513 "supported_io_types": { 00:11:51.513 "read": true, 00:11:51.513 "write": true, 00:11:51.513 "unmap": true, 00:11:51.513 "flush": true, 00:11:51.513 "reset": true, 00:11:51.513 "nvme_admin": false, 00:11:51.513 "nvme_io": false, 00:11:51.513 "nvme_io_md": false, 00:11:51.513 "write_zeroes": true, 00:11:51.513 "zcopy": true, 00:11:51.513 "get_zone_info": false, 00:11:51.513 "zone_management": false, 00:11:51.513 "zone_append": false, 00:11:51.513 "compare": false, 00:11:51.513 "compare_and_write": false, 00:11:51.513 "abort": true, 00:11:51.513 "seek_hole": false, 00:11:51.513 "seek_data": false, 00:11:51.513 "copy": true, 00:11:51.513 "nvme_iov_md": false 00:11:51.513 }, 00:11:51.513 "memory_domains": [ 00:11:51.513 { 00:11:51.513 "dma_device_id": "system", 00:11:51.513 "dma_device_type": 1 00:11:51.513 }, 00:11:51.513 { 00:11:51.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.513 "dma_device_type": 2 00:11:51.513 } 00:11:51.513 ], 00:11:51.513 "driver_specific": {} 00:11:51.513 }' 00:11:51.513 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.513 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.513 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.513 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.513 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:51.772 10:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:52.031 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:52.031 "name": "BaseBdev2", 00:11:52.031 "aliases": [ 00:11:52.031 "6d4c5333-7f60-4922-a585-e4eb423bdf9a" 00:11:52.031 ], 00:11:52.031 "product_name": "Malloc disk", 00:11:52.031 "block_size": 512, 00:11:52.031 "num_blocks": 65536, 00:11:52.031 "uuid": "6d4c5333-7f60-4922-a585-e4eb423bdf9a", 00:11:52.031 "assigned_rate_limits": { 00:11:52.031 "rw_ios_per_sec": 0, 00:11:52.031 "rw_mbytes_per_sec": 0, 00:11:52.031 "r_mbytes_per_sec": 0, 00:11:52.031 "w_mbytes_per_sec": 0 00:11:52.031 }, 00:11:52.031 "claimed": true, 00:11:52.031 "claim_type": "exclusive_write", 00:11:52.031 "zoned": false, 00:11:52.031 "supported_io_types": { 00:11:52.031 "read": true, 00:11:52.031 "write": true, 00:11:52.031 "unmap": true, 
00:11:52.031 "flush": true, 00:11:52.031 "reset": true, 00:11:52.031 "nvme_admin": false, 00:11:52.031 "nvme_io": false, 00:11:52.031 "nvme_io_md": false, 00:11:52.031 "write_zeroes": true, 00:11:52.031 "zcopy": true, 00:11:52.031 "get_zone_info": false, 00:11:52.031 "zone_management": false, 00:11:52.031 "zone_append": false, 00:11:52.031 "compare": false, 00:11:52.031 "compare_and_write": false, 00:11:52.031 "abort": true, 00:11:52.031 "seek_hole": false, 00:11:52.031 "seek_data": false, 00:11:52.031 "copy": true, 00:11:52.031 "nvme_iov_md": false 00:11:52.031 }, 00:11:52.031 "memory_domains": [ 00:11:52.031 { 00:11:52.031 "dma_device_id": "system", 00:11:52.031 "dma_device_type": 1 00:11:52.031 }, 00:11:52.031 { 00:11:52.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.031 "dma_device_type": 2 00:11:52.031 } 00:11:52.031 ], 00:11:52.031 "driver_specific": {} 00:11:52.031 }' 00:11:52.031 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.031 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.031 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:52.031 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.289 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:52.856 [2024-07-12 10:38:27.944989] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:52.856 [2024-07-12 10:38:27.945016] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:52.856 [2024-07-12 10:38:27.945056] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.856 
10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.856 10:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.114 10:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.115 "name": "Existed_Raid", 00:11:53.115 "uuid": "be00063f-24b3-466c-8c78-ad72f1ef8f3b", 00:11:53.115 "strip_size_kb": 64, 00:11:53.115 "state": "offline", 00:11:53.115 "raid_level": "concat", 00:11:53.115 "superblock": true, 00:11:53.115 "num_base_bdevs": 2, 00:11:53.115 "num_base_bdevs_discovered": 1, 00:11:53.115 "num_base_bdevs_operational": 1, 00:11:53.115 "base_bdevs_list": [ 00:11:53.115 { 00:11:53.115 "name": null, 00:11:53.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.115 "is_configured": false, 00:11:53.115 "data_offset": 2048, 00:11:53.115 "data_size": 63488 00:11:53.115 }, 00:11:53.115 { 00:11:53.115 "name": "BaseBdev2", 00:11:53.115 "uuid": "6d4c5333-7f60-4922-a585-e4eb423bdf9a", 00:11:53.115 "is_configured": true, 00:11:53.115 "data_offset": 2048, 00:11:53.115 "data_size": 63488 00:11:53.115 } 00:11:53.115 ] 00:11:53.115 }' 00:11:53.115 10:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.115 10:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:53.681 10:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:53.681 10:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:53.681 10:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.681 10:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:53.940 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:53.940 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:53.940 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:54.199 [2024-07-12 10:38:29.294464] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:54.199 [2024-07-12 
10:38:29.294523] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x253b000 name Existed_Raid, state offline 00:11:54.199 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:54.199 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:54.199 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.199 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2022892 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2022892 ']' 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2022892 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2022892 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2022892' 00:11:54.457 killing process with pid 2022892 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2022892 00:11:54.457 [2024-07-12 10:38:29.617258] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:54.457 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2022892 00:11:54.457 [2024-07-12 10:38:29.618112] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:54.716 10:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:54.716 00:11:54.716 real 0m9.885s 00:11:54.716 user 0m17.952s 00:11:54.716 sys 0m1.927s 00:11:54.716 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:54.716 10:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.716 ************************************ 00:11:54.716 END TEST raid_state_function_test_sb 00:11:54.716 ************************************ 00:11:54.716 10:38:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:54.716 10:38:29 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:54.716 10:38:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:54.716 10:38:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:54.716 10:38:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
00:11:54.716 ************************************ 00:11:54.716 START TEST raid_superblock_test 00:11:54.716 ************************************ 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2024468 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2024468 /var/tmp/spdk-raid.sock 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2024468 ']' 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:54.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:54.716 10:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.975 [2024-07-12 10:38:29.948120] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:11:54.975 [2024-07-12 10:38:29.948191] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2024468 ] 00:11:54.975 [2024-07-12 10:38:30.076906] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:55.233 [2024-07-12 10:38:30.179751] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:55.233 [2024-07-12 10:38:30.250872] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.233 [2024-07-12 10:38:30.250926] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:55.800 10:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:56.064 malloc1 00:11:56.064 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:56.322 [2024-07-12 10:38:31.285129] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:56.322 [2024-07-12 10:38:31.285177] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.322 [2024-07-12 10:38:31.285199] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x245a570 00:11:56.322 [2024-07-12 10:38:31.285213] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.322 [2024-07-12 10:38:31.286953] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.322 [2024-07-12 10:38:31.286983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:56.322 pt1 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:56.322 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:56.580 malloc2 00:11:56.580 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:56.580 [2024-07-12 10:38:31.768429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:56.580 [2024-07-12 10:38:31.768474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.580 [2024-07-12 10:38:31.768497] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x245b970 00:11:56.580 [2024-07-12 10:38:31.768510] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.580 [2024-07-12 10:38:31.770110] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.580 [2024-07-12 10:38:31.770139] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:56.580 pt2 00:11:56.838 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:56.838 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:56.838 10:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:56.838 [2024-07-12 10:38:32.005077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:56.838 [2024-07-12 10:38:32.006431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:56.838 [2024-07-12 10:38:32.006589] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25fe270 00:11:56.838 [2024-07-12 10:38:32.006602] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:56.838 [2024-07-12 10:38:32.006800] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25f3c10 00:11:56.838 [2024-07-12 10:38:32.006945] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25fe270 00:11:56.838 [2024-07-12 10:38:32.006955] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25fe270 00:11:56.838 [2024-07-12 10:38:32.007054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:56.838 10:38:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.838 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:57.097 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.097 "name": "raid_bdev1", 00:11:57.097 "uuid": "8096cb55-2bbc-4703-bf8a-d5494ce7befb", 00:11:57.097 "strip_size_kb": 64, 00:11:57.097 "state": "online", 00:11:57.097 "raid_level": "concat", 00:11:57.097 "superblock": true, 00:11:57.097 "num_base_bdevs": 2, 00:11:57.097 "num_base_bdevs_discovered": 2, 00:11:57.097 "num_base_bdevs_operational": 2, 00:11:57.097 "base_bdevs_list": [ 00:11:57.097 { 00:11:57.097 "name": "pt1", 00:11:57.097 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:57.097 "is_configured": true, 00:11:57.097 "data_offset": 2048, 00:11:57.097 "data_size": 63488 00:11:57.097 }, 00:11:57.097 { 00:11:57.097 "name": "pt2", 00:11:57.097 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:57.097 "is_configured": true, 00:11:57.097 "data_offset": 2048, 00:11:57.097 "data_size": 63488 00:11:57.097 } 00:11:57.097 ] 00:11:57.097 }' 00:11:57.097 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.097 10:38:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.032 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:58.032 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:58.032 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:58.032 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:58.032 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:58.033 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:58.033 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:58.033 10:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:58.033 [2024-07-12 10:38:33.120249] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:58.033 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:58.033 "name": "raid_bdev1", 00:11:58.033 "aliases": [ 00:11:58.033 "8096cb55-2bbc-4703-bf8a-d5494ce7befb" 00:11:58.033 ], 00:11:58.033 "product_name": "Raid Volume", 00:11:58.033 "block_size": 512, 00:11:58.033 "num_blocks": 126976, 00:11:58.033 "uuid": 
"8096cb55-2bbc-4703-bf8a-d5494ce7befb", 00:11:58.033 "assigned_rate_limits": { 00:11:58.033 "rw_ios_per_sec": 0, 00:11:58.033 "rw_mbytes_per_sec": 0, 00:11:58.033 "r_mbytes_per_sec": 0, 00:11:58.033 "w_mbytes_per_sec": 0 00:11:58.033 }, 00:11:58.033 "claimed": false, 00:11:58.033 "zoned": false, 00:11:58.033 "supported_io_types": { 00:11:58.033 "read": true, 00:11:58.033 "write": true, 00:11:58.033 "unmap": true, 00:11:58.033 "flush": true, 00:11:58.033 "reset": true, 00:11:58.033 "nvme_admin": false, 00:11:58.033 "nvme_io": false, 00:11:58.033 "nvme_io_md": false, 00:11:58.033 "write_zeroes": true, 00:11:58.033 "zcopy": false, 00:11:58.033 "get_zone_info": false, 00:11:58.033 "zone_management": false, 00:11:58.033 "zone_append": false, 00:11:58.033 "compare": false, 00:11:58.033 "compare_and_write": false, 00:11:58.033 "abort": false, 00:11:58.033 "seek_hole": false, 00:11:58.033 "seek_data": false, 00:11:58.033 "copy": false, 00:11:58.033 "nvme_iov_md": false 00:11:58.033 }, 00:11:58.033 "memory_domains": [ 00:11:58.033 { 00:11:58.033 "dma_device_id": "system", 00:11:58.033 "dma_device_type": 1 00:11:58.033 }, 00:11:58.033 { 00:11:58.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.033 "dma_device_type": 2 00:11:58.033 }, 00:11:58.033 { 00:11:58.033 "dma_device_id": "system", 00:11:58.033 "dma_device_type": 1 00:11:58.033 }, 00:11:58.033 { 00:11:58.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.033 "dma_device_type": 2 00:11:58.033 } 00:11:58.033 ], 00:11:58.033 "driver_specific": { 00:11:58.033 "raid": { 00:11:58.033 "uuid": "8096cb55-2bbc-4703-bf8a-d5494ce7befb", 00:11:58.033 "strip_size_kb": 64, 00:11:58.033 "state": "online", 00:11:58.033 "raid_level": "concat", 00:11:58.033 "superblock": true, 00:11:58.033 "num_base_bdevs": 2, 00:11:58.033 "num_base_bdevs_discovered": 2, 00:11:58.033 "num_base_bdevs_operational": 2, 00:11:58.033 "base_bdevs_list": [ 00:11:58.033 { 00:11:58.033 "name": "pt1", 00:11:58.033 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.033 "is_configured": true, 00:11:58.033 "data_offset": 2048, 00:11:58.033 "data_size": 63488 00:11:58.033 }, 00:11:58.033 { 00:11:58.033 "name": "pt2", 00:11:58.033 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.033 "is_configured": true, 00:11:58.033 "data_offset": 2048, 00:11:58.033 "data_size": 63488 00:11:58.033 } 00:11:58.033 ] 00:11:58.033 } 00:11:58.033 } 00:11:58.033 }' 00:11:58.033 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:58.033 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:58.033 pt2' 00:11:58.033 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.033 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:58.033 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.292 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.292 "name": "pt1", 00:11:58.292 "aliases": [ 00:11:58.292 "00000000-0000-0000-0000-000000000001" 00:11:58.292 ], 00:11:58.292 "product_name": "passthru", 00:11:58.292 "block_size": 512, 00:11:58.292 "num_blocks": 65536, 00:11:58.292 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.292 "assigned_rate_limits": { 00:11:58.292 
"rw_ios_per_sec": 0, 00:11:58.292 "rw_mbytes_per_sec": 0, 00:11:58.292 "r_mbytes_per_sec": 0, 00:11:58.292 "w_mbytes_per_sec": 0 00:11:58.292 }, 00:11:58.292 "claimed": true, 00:11:58.292 "claim_type": "exclusive_write", 00:11:58.292 "zoned": false, 00:11:58.292 "supported_io_types": { 00:11:58.292 "read": true, 00:11:58.292 "write": true, 00:11:58.292 "unmap": true, 00:11:58.292 "flush": true, 00:11:58.292 "reset": true, 00:11:58.292 "nvme_admin": false, 00:11:58.292 "nvme_io": false, 00:11:58.292 "nvme_io_md": false, 00:11:58.292 "write_zeroes": true, 00:11:58.292 "zcopy": true, 00:11:58.292 "get_zone_info": false, 00:11:58.292 "zone_management": false, 00:11:58.292 "zone_append": false, 00:11:58.292 "compare": false, 00:11:58.292 "compare_and_write": false, 00:11:58.292 "abort": true, 00:11:58.292 "seek_hole": false, 00:11:58.292 "seek_data": false, 00:11:58.292 "copy": true, 00:11:58.292 "nvme_iov_md": false 00:11:58.292 }, 00:11:58.292 "memory_domains": [ 00:11:58.292 { 00:11:58.292 "dma_device_id": "system", 00:11:58.292 "dma_device_type": 1 00:11:58.292 }, 00:11:58.292 { 00:11:58.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.292 "dma_device_type": 2 00:11:58.292 } 00:11:58.292 ], 00:11:58.292 "driver_specific": { 00:11:58.292 "passthru": { 00:11:58.292 "name": "pt1", 00:11:58.292 "base_bdev_name": "malloc1" 00:11:58.292 } 00:11:58.292 } 00:11:58.292 }' 00:11:58.292 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.292 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:58.588 10:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.875 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.875 "name": "pt2", 00:11:58.875 "aliases": [ 00:11:58.875 "00000000-0000-0000-0000-000000000002" 00:11:58.875 ], 00:11:58.875 "product_name": "passthru", 00:11:58.875 "block_size": 512, 00:11:58.875 "num_blocks": 65536, 00:11:58.875 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.875 "assigned_rate_limits": { 00:11:58.875 "rw_ios_per_sec": 0, 00:11:58.875 "rw_mbytes_per_sec": 0, 00:11:58.875 "r_mbytes_per_sec": 0, 00:11:58.875 "w_mbytes_per_sec": 0 
00:11:58.875 }, 00:11:58.875 "claimed": true, 00:11:58.875 "claim_type": "exclusive_write", 00:11:58.875 "zoned": false, 00:11:58.875 "supported_io_types": { 00:11:58.875 "read": true, 00:11:58.875 "write": true, 00:11:58.875 "unmap": true, 00:11:58.875 "flush": true, 00:11:58.875 "reset": true, 00:11:58.875 "nvme_admin": false, 00:11:58.875 "nvme_io": false, 00:11:58.875 "nvme_io_md": false, 00:11:58.875 "write_zeroes": true, 00:11:58.875 "zcopy": true, 00:11:58.875 "get_zone_info": false, 00:11:58.875 "zone_management": false, 00:11:58.875 "zone_append": false, 00:11:58.875 "compare": false, 00:11:58.875 "compare_and_write": false, 00:11:58.875 "abort": true, 00:11:58.875 "seek_hole": false, 00:11:58.875 "seek_data": false, 00:11:58.875 "copy": true, 00:11:58.875 "nvme_iov_md": false 00:11:58.875 }, 00:11:58.875 "memory_domains": [ 00:11:58.875 { 00:11:58.875 "dma_device_id": "system", 00:11:58.875 "dma_device_type": 1 00:11:58.875 }, 00:11:58.875 { 00:11:58.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.875 "dma_device_type": 2 00:11:58.875 } 00:11:58.875 ], 00:11:58.875 "driver_specific": { 00:11:58.875 "passthru": { 00:11:58.875 "name": "pt2", 00:11:58.875 "base_bdev_name": "malloc2" 00:11:58.875 } 00:11:58.875 } 00:11:58.875 }' 00:11:58.875 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.875 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:59.137 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:59.394 [2024-07-12 10:38:34.552037] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:59.394 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8096cb55-2bbc-4703-bf8a-d5494ce7befb 00:11:59.394 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8096cb55-2bbc-4703-bf8a-d5494ce7befb ']' 00:11:59.394 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:59.652 [2024-07-12 10:38:34.796438] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:59.652 [2024-07-12 10:38:34.796458] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:59.652 [2024-07-12 10:38:34.796516] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:59.652 [2024-07-12 10:38:34.796560] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:59.652 [2024-07-12 10:38:34.796571] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25fe270 name raid_bdev1, state offline 00:11:59.652 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.652 10:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:59.911 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:59.911 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:59.911 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:59.911 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:00.169 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:00.169 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:00.427 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:00.427 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:00.685 10:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.943 [2024-07-12 10:38:36.031663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:00.943 [2024-07-12 10:38:36.033040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:00.943 [2024-07-12 10:38:36.033093] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:00.943 [2024-07-12 10:38:36.033133] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:00.943 [2024-07-12 10:38:36.033152] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:00.943 [2024-07-12 10:38:36.033161] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25fdff0 name raid_bdev1, state configuring 00:12:00.943 request: 00:12:00.943 { 00:12:00.943 "name": "raid_bdev1", 00:12:00.943 "raid_level": "concat", 00:12:00.943 "base_bdevs": [ 00:12:00.943 "malloc1", 00:12:00.943 "malloc2" 00:12:00.943 ], 00:12:00.943 "strip_size_kb": 64, 00:12:00.943 "superblock": false, 00:12:00.943 "method": "bdev_raid_create", 00:12:00.943 "req_id": 1 00:12:00.943 } 00:12:00.943 Got JSON-RPC error response 00:12:00.943 response: 00:12:00.943 { 00:12:00.943 "code": -17, 00:12:00.943 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:00.943 } 00:12:00.943 10:38:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:00.943 10:38:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:00.943 10:38:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:00.943 10:38:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:00.943 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.943 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:01.201 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:01.201 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:01.201 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:01.460 [2024-07-12 10:38:36.516881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:01.460 [2024-07-12 10:38:36.516928] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:01.460 [2024-07-12 10:38:36.516952] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x245a7a0 00:12:01.460 [2024-07-12 10:38:36.516965] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:01.460 [2024-07-12 10:38:36.518595] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:01.460 [2024-07-12 10:38:36.518624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:01.460 [2024-07-12 10:38:36.518693] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:01.460 [2024-07-12 10:38:36.518717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:01.460 pt1 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.460 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:01.718 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.718 "name": "raid_bdev1", 00:12:01.718 "uuid": "8096cb55-2bbc-4703-bf8a-d5494ce7befb", 00:12:01.718 "strip_size_kb": 64, 00:12:01.718 "state": "configuring", 00:12:01.718 "raid_level": "concat", 00:12:01.718 "superblock": true, 00:12:01.718 "num_base_bdevs": 2, 00:12:01.718 "num_base_bdevs_discovered": 1, 00:12:01.718 "num_base_bdevs_operational": 2, 00:12:01.718 "base_bdevs_list": [ 00:12:01.718 { 00:12:01.718 "name": "pt1", 00:12:01.718 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:01.718 "is_configured": true, 00:12:01.718 "data_offset": 2048, 00:12:01.718 "data_size": 63488 00:12:01.718 }, 00:12:01.718 { 00:12:01.718 "name": null, 00:12:01.718 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:01.718 "is_configured": false, 00:12:01.718 "data_offset": 2048, 00:12:01.718 "data_size": 63488 00:12:01.718 } 00:12:01.718 ] 00:12:01.718 }' 00:12:01.718 10:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.718 10:38:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.286 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:02.286 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:02.286 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:02.286 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:02.544 [2024-07-12 10:38:37.527569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:02.544 [2024-07-12 10:38:37.527616] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:02.544 [2024-07-12 10:38:37.527633] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25f4820 00:12:02.544 [2024-07-12 10:38:37.527647] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:02.544 [2024-07-12 10:38:37.527988] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:02.544 [2024-07-12 10:38:37.528008] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:02.544 [2024-07-12 10:38:37.528068] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:02.544 [2024-07-12 10:38:37.528087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:02.544 [2024-07-12 10:38:37.528179] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2450ec0 00:12:02.544 [2024-07-12 10:38:37.528190] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:02.544 [2024-07-12 10:38:37.528357] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2451f00 00:12:02.544 [2024-07-12 10:38:37.528479] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2450ec0 00:12:02.544 [2024-07-12 10:38:37.528499] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2450ec0 00:12:02.544 [2024-07-12 10:38:37.528600] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:02.544 pt2 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.544 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:02.802 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:02.802 "name": "raid_bdev1", 00:12:02.802 "uuid": "8096cb55-2bbc-4703-bf8a-d5494ce7befb", 00:12:02.802 "strip_size_kb": 64, 00:12:02.802 "state": "online", 00:12:02.802 "raid_level": "concat", 00:12:02.803 "superblock": true, 00:12:02.803 "num_base_bdevs": 2, 00:12:02.803 "num_base_bdevs_discovered": 2, 00:12:02.803 "num_base_bdevs_operational": 2, 00:12:02.803 "base_bdevs_list": [ 00:12:02.803 { 00:12:02.803 "name": "pt1", 00:12:02.803 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:02.803 "is_configured": true, 00:12:02.803 "data_offset": 2048, 00:12:02.803 "data_size": 63488 00:12:02.803 }, 00:12:02.803 { 00:12:02.803 "name": "pt2", 00:12:02.803 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:02.803 "is_configured": true, 00:12:02.803 "data_offset": 2048, 00:12:02.803 "data_size": 63488 00:12:02.803 } 00:12:02.803 ] 00:12:02.803 }' 00:12:02.803 10:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.803 10:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:03.369 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:03.369 [2024-07-12 10:38:38.542506] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.635 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:03.635 "name": "raid_bdev1", 00:12:03.635 "aliases": [ 00:12:03.635 "8096cb55-2bbc-4703-bf8a-d5494ce7befb" 00:12:03.635 ], 00:12:03.635 "product_name": "Raid Volume", 00:12:03.635 "block_size": 512, 00:12:03.635 "num_blocks": 126976, 00:12:03.635 "uuid": "8096cb55-2bbc-4703-bf8a-d5494ce7befb", 00:12:03.635 "assigned_rate_limits": { 00:12:03.635 "rw_ios_per_sec": 0, 00:12:03.635 "rw_mbytes_per_sec": 0, 00:12:03.635 "r_mbytes_per_sec": 0, 00:12:03.635 "w_mbytes_per_sec": 0 00:12:03.635 }, 00:12:03.635 "claimed": false, 00:12:03.635 "zoned": false, 00:12:03.635 "supported_io_types": { 00:12:03.635 "read": true, 00:12:03.635 "write": true, 00:12:03.635 "unmap": true, 00:12:03.635 "flush": true, 00:12:03.635 "reset": true, 00:12:03.635 "nvme_admin": false, 00:12:03.635 "nvme_io": false, 00:12:03.635 "nvme_io_md": false, 00:12:03.635 "write_zeroes": true, 00:12:03.635 "zcopy": false, 00:12:03.635 "get_zone_info": false, 00:12:03.635 "zone_management": false, 00:12:03.635 "zone_append": false, 00:12:03.635 "compare": false, 00:12:03.635 "compare_and_write": false, 00:12:03.635 "abort": false, 00:12:03.635 "seek_hole": false, 00:12:03.635 "seek_data": false, 00:12:03.635 "copy": false, 00:12:03.635 "nvme_iov_md": false 00:12:03.635 }, 00:12:03.635 "memory_domains": [ 00:12:03.635 { 00:12:03.635 "dma_device_id": 
"system", 00:12:03.635 "dma_device_type": 1 00:12:03.635 }, 00:12:03.635 { 00:12:03.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.635 "dma_device_type": 2 00:12:03.635 }, 00:12:03.635 { 00:12:03.635 "dma_device_id": "system", 00:12:03.635 "dma_device_type": 1 00:12:03.635 }, 00:12:03.635 { 00:12:03.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.635 "dma_device_type": 2 00:12:03.635 } 00:12:03.635 ], 00:12:03.635 "driver_specific": { 00:12:03.635 "raid": { 00:12:03.635 "uuid": "8096cb55-2bbc-4703-bf8a-d5494ce7befb", 00:12:03.635 "strip_size_kb": 64, 00:12:03.635 "state": "online", 00:12:03.635 "raid_level": "concat", 00:12:03.635 "superblock": true, 00:12:03.635 "num_base_bdevs": 2, 00:12:03.635 "num_base_bdevs_discovered": 2, 00:12:03.635 "num_base_bdevs_operational": 2, 00:12:03.635 "base_bdevs_list": [ 00:12:03.635 { 00:12:03.635 "name": "pt1", 00:12:03.635 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.635 "is_configured": true, 00:12:03.635 "data_offset": 2048, 00:12:03.635 "data_size": 63488 00:12:03.635 }, 00:12:03.635 { 00:12:03.635 "name": "pt2", 00:12:03.635 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.635 "is_configured": true, 00:12:03.635 "data_offset": 2048, 00:12:03.635 "data_size": 63488 00:12:03.635 } 00:12:03.635 ] 00:12:03.635 } 00:12:03.635 } 00:12:03.635 }' 00:12:03.635 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:03.635 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:03.635 pt2' 00:12:03.635 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.635 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:03.635 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.895 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.895 "name": "pt1", 00:12:03.895 "aliases": [ 00:12:03.895 "00000000-0000-0000-0000-000000000001" 00:12:03.895 ], 00:12:03.895 "product_name": "passthru", 00:12:03.895 "block_size": 512, 00:12:03.895 "num_blocks": 65536, 00:12:03.895 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.895 "assigned_rate_limits": { 00:12:03.895 "rw_ios_per_sec": 0, 00:12:03.895 "rw_mbytes_per_sec": 0, 00:12:03.895 "r_mbytes_per_sec": 0, 00:12:03.895 "w_mbytes_per_sec": 0 00:12:03.895 }, 00:12:03.895 "claimed": true, 00:12:03.895 "claim_type": "exclusive_write", 00:12:03.895 "zoned": false, 00:12:03.895 "supported_io_types": { 00:12:03.895 "read": true, 00:12:03.895 "write": true, 00:12:03.895 "unmap": true, 00:12:03.895 "flush": true, 00:12:03.895 "reset": true, 00:12:03.895 "nvme_admin": false, 00:12:03.895 "nvme_io": false, 00:12:03.895 "nvme_io_md": false, 00:12:03.895 "write_zeroes": true, 00:12:03.895 "zcopy": true, 00:12:03.895 "get_zone_info": false, 00:12:03.895 "zone_management": false, 00:12:03.895 "zone_append": false, 00:12:03.895 "compare": false, 00:12:03.895 "compare_and_write": false, 00:12:03.895 "abort": true, 00:12:03.895 "seek_hole": false, 00:12:03.895 "seek_data": false, 00:12:03.895 "copy": true, 00:12:03.895 "nvme_iov_md": false 00:12:03.895 }, 00:12:03.895 "memory_domains": [ 00:12:03.895 { 00:12:03.895 "dma_device_id": "system", 00:12:03.895 "dma_device_type": 1 00:12:03.895 }, 
00:12:03.895 { 00:12:03.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.895 "dma_device_type": 2 00:12:03.895 } 00:12:03.895 ], 00:12:03.895 "driver_specific": { 00:12:03.895 "passthru": { 00:12:03.895 "name": "pt1", 00:12:03.895 "base_bdev_name": "malloc1" 00:12:03.895 } 00:12:03.895 } 00:12:03.895 }' 00:12:03.895 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.895 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.895 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.895 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.895 10:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.895 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.895 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.895 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.153 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.153 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.153 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.153 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.153 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.153 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:04.153 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.411 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.411 "name": "pt2", 00:12:04.411 "aliases": [ 00:12:04.411 "00000000-0000-0000-0000-000000000002" 00:12:04.411 ], 00:12:04.411 "product_name": "passthru", 00:12:04.411 "block_size": 512, 00:12:04.411 "num_blocks": 65536, 00:12:04.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:04.411 "assigned_rate_limits": { 00:12:04.411 "rw_ios_per_sec": 0, 00:12:04.411 "rw_mbytes_per_sec": 0, 00:12:04.411 "r_mbytes_per_sec": 0, 00:12:04.411 "w_mbytes_per_sec": 0 00:12:04.411 }, 00:12:04.411 "claimed": true, 00:12:04.411 "claim_type": "exclusive_write", 00:12:04.411 "zoned": false, 00:12:04.411 "supported_io_types": { 00:12:04.411 "read": true, 00:12:04.411 "write": true, 00:12:04.411 "unmap": true, 00:12:04.411 "flush": true, 00:12:04.411 "reset": true, 00:12:04.411 "nvme_admin": false, 00:12:04.411 "nvme_io": false, 00:12:04.411 "nvme_io_md": false, 00:12:04.411 "write_zeroes": true, 00:12:04.411 "zcopy": true, 00:12:04.411 "get_zone_info": false, 00:12:04.411 "zone_management": false, 00:12:04.411 "zone_append": false, 00:12:04.411 "compare": false, 00:12:04.411 "compare_and_write": false, 00:12:04.412 "abort": true, 00:12:04.412 "seek_hole": false, 00:12:04.412 "seek_data": false, 00:12:04.412 "copy": true, 00:12:04.412 "nvme_iov_md": false 00:12:04.412 }, 00:12:04.412 "memory_domains": [ 00:12:04.412 { 00:12:04.412 "dma_device_id": "system", 00:12:04.412 "dma_device_type": 1 00:12:04.412 }, 00:12:04.412 { 00:12:04.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.412 "dma_device_type": 2 00:12:04.412 } 00:12:04.412 ], 
00:12:04.412 "driver_specific": { 00:12:04.412 "passthru": { 00:12:04.412 "name": "pt2", 00:12:04.412 "base_bdev_name": "malloc2" 00:12:04.412 } 00:12:04.412 } 00:12:04.412 }' 00:12:04.412 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.412 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.412 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.412 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.412 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.412 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.412 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.669 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.669 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.669 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.669 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.669 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.669 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:04.669 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:04.928 [2024-07-12 10:38:39.906132] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8096cb55-2bbc-4703-bf8a-d5494ce7befb '!=' 8096cb55-2bbc-4703-bf8a-d5494ce7befb ']' 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2024468 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2024468 ']' 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2024468 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2024468 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2024468' 00:12:04.928 killing process with pid 2024468 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2024468 00:12:04.928 [2024-07-12 10:38:39.980178] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:04.928 [2024-07-12 
10:38:39.980237] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:04.928 [2024-07-12 10:38:39.980278] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:04.928 [2024-07-12 10:38:39.980290] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2450ec0 name raid_bdev1, state offline 00:12:04.928 10:38:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2024468 00:12:04.928 [2024-07-12 10:38:39.998102] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:05.186 10:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:05.186 00:12:05.186 real 0m10.314s 00:12:05.186 user 0m18.399s 00:12:05.186 sys 0m1.915s 00:12:05.186 10:38:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:05.186 10:38:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.186 ************************************ 00:12:05.186 END TEST raid_superblock_test 00:12:05.186 ************************************ 00:12:05.186 10:38:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:05.186 10:38:40 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:05.186 10:38:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:05.186 10:38:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:05.186 10:38:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:05.186 ************************************ 00:12:05.186 START TEST raid_read_error_test 00:12:05.186 ************************************ 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:05.186 10:38:40 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.45klFQ4cua 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2025986 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2025986 /var/tmp/spdk-raid.sock 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2025986 ']' 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:05.186 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:05.186 10:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.186 [2024-07-12 10:38:40.371403] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
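Note: the raid_read_error_test trace below drives a bdevperf instance over JSON-RPC on /var/tmp/spdk-raid.sock. A condensed sketch of the sequence it exercises, using only RPCs that appear in the trace itself (paths abbreviated to the spdk checkout, ordering approximate):

    # build an error-injectable passthru stack on top of each malloc base bdev
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc   # 32 MiB malloc bdev, 512-byte blocks
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc              # exposes EE_BaseBdev1_malloc for fault injection
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    # (the same three calls are repeated for BaseBdev2)
    # assemble the concat RAID with a superblock (-s) and a 64 KB strip size (-z 64)
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
    # start the randrw workload configured at bdevperf launch, then inject a read failure on the first base bdev
    examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure

Throughout the trace, RAID state is verified by piping 'rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all' through jq -r '.[] | select(.name == "raid_bdev1")' and comparing the reported state, raid_level, strip_size_kb and base bdev counts against the expected values.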
00:12:05.186 [2024-07-12 10:38:40.371471] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2025986 ] 00:12:05.443 [2024-07-12 10:38:40.489221] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:05.443 [2024-07-12 10:38:40.591892] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:05.700 [2024-07-12 10:38:40.652585] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.700 [2024-07-12 10:38:40.652618] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:06.274 10:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:06.274 10:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:06.274 10:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:06.274 10:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:06.531 BaseBdev1_malloc 00:12:06.531 10:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:06.788 true 00:12:06.788 10:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:07.046 [2024-07-12 10:38:42.021424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:07.046 [2024-07-12 10:38:42.021467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.046 [2024-07-12 10:38:42.021492] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15980d0 00:12:07.046 [2024-07-12 10:38:42.021506] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.046 [2024-07-12 10:38:42.023320] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.046 [2024-07-12 10:38:42.023351] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:07.046 BaseBdev1 00:12:07.046 10:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:07.046 10:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:07.303 BaseBdev2_malloc 00:12:07.303 10:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:07.560 true 00:12:07.560 10:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:07.818 [2024-07-12 10:38:42.760003] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:07.818 [2024-07-12 10:38:42.760042] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.818 [2024-07-12 10:38:42.760062] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x159c910 00:12:07.818 [2024-07-12 10:38:42.760074] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.818 [2024-07-12 10:38:42.761501] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.818 [2024-07-12 10:38:42.761531] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:07.818 BaseBdev2 00:12:07.818 10:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:07.818 [2024-07-12 10:38:43.000666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:07.818 [2024-07-12 10:38:43.001900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:07.818 [2024-07-12 10:38:43.002080] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x159e320 00:12:07.818 [2024-07-12 10:38:43.002093] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:07.818 [2024-07-12 10:38:43.002272] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x159f290 00:12:07.818 [2024-07-12 10:38:43.002413] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x159e320 00:12:07.818 [2024-07-12 10:38:43.002424] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x159e320 00:12:07.818 [2024-07-12 10:38:43.002530] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:08.075 "name": "raid_bdev1", 00:12:08.075 "uuid": "e64cee39-3c0e-4cbc-90b2-bf0098771d30", 00:12:08.075 "strip_size_kb": 64, 00:12:08.075 "state": "online", 00:12:08.075 "raid_level": 
"concat", 00:12:08.075 "superblock": true, 00:12:08.075 "num_base_bdevs": 2, 00:12:08.075 "num_base_bdevs_discovered": 2, 00:12:08.075 "num_base_bdevs_operational": 2, 00:12:08.075 "base_bdevs_list": [ 00:12:08.075 { 00:12:08.075 "name": "BaseBdev1", 00:12:08.075 "uuid": "9dcd3a9a-1586-5174-9cdd-da649e969505", 00:12:08.075 "is_configured": true, 00:12:08.075 "data_offset": 2048, 00:12:08.075 "data_size": 63488 00:12:08.075 }, 00:12:08.075 { 00:12:08.075 "name": "BaseBdev2", 00:12:08.075 "uuid": "eb5623f1-2595-5f92-84c9-012a41c37c1b", 00:12:08.075 "is_configured": true, 00:12:08.075 "data_offset": 2048, 00:12:08.075 "data_size": 63488 00:12:08.075 } 00:12:08.075 ] 00:12:08.075 }' 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:08.075 10:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.008 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:09.008 10:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:09.008 [2024-07-12 10:38:43.919373] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15999b0 00:12:09.943 10:38:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.943 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.201 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:10.201 "name": "raid_bdev1", 00:12:10.201 "uuid": "e64cee39-3c0e-4cbc-90b2-bf0098771d30", 00:12:10.201 "strip_size_kb": 64, 00:12:10.201 "state": "online", 
00:12:10.201 "raid_level": "concat", 00:12:10.201 "superblock": true, 00:12:10.201 "num_base_bdevs": 2, 00:12:10.201 "num_base_bdevs_discovered": 2, 00:12:10.201 "num_base_bdevs_operational": 2, 00:12:10.201 "base_bdevs_list": [ 00:12:10.201 { 00:12:10.201 "name": "BaseBdev1", 00:12:10.201 "uuid": "9dcd3a9a-1586-5174-9cdd-da649e969505", 00:12:10.201 "is_configured": true, 00:12:10.201 "data_offset": 2048, 00:12:10.201 "data_size": 63488 00:12:10.201 }, 00:12:10.201 { 00:12:10.201 "name": "BaseBdev2", 00:12:10.202 "uuid": "eb5623f1-2595-5f92-84c9-012a41c37c1b", 00:12:10.202 "is_configured": true, 00:12:10.202 "data_offset": 2048, 00:12:10.202 "data_size": 63488 00:12:10.202 } 00:12:10.202 ] 00:12:10.202 }' 00:12:10.202 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:10.202 10:38:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.768 10:38:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:11.028 [2024-07-12 10:38:46.155401] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:11.028 [2024-07-12 10:38:46.155433] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:11.028 [2024-07-12 10:38:46.158604] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:11.028 [2024-07-12 10:38:46.158635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:11.028 [2024-07-12 10:38:46.158663] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:11.028 [2024-07-12 10:38:46.158674] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x159e320 name raid_bdev1, state offline 00:12:11.028 0 00:12:11.028 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2025986 00:12:11.028 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2025986 ']' 00:12:11.028 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2025986 00:12:11.028 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:11.028 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:11.028 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2025986 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2025986' 00:12:11.287 killing process with pid 2025986 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2025986 00:12:11.287 [2024-07-12 10:38:46.225703] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2025986 00:12:11.287 [2024-07-12 10:38:46.236390] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.45klFQ4cua 00:12:11.287 10:38:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:12:11.287 00:12:11.287 real 0m6.175s 00:12:11.287 user 0m9.625s 00:12:11.287 sys 0m1.058s 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:11.287 10:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.287 ************************************ 00:12:11.287 END TEST raid_read_error_test 00:12:11.287 ************************************ 00:12:11.547 10:38:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:11.547 10:38:46 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:11.547 10:38:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:11.547 10:38:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:11.547 10:38:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:11.547 ************************************ 00:12:11.547 START TEST raid_write_error_test 00:12:11.547 ************************************ 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:11.547 10:38:46 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.aN4uVhgU7E 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2026955 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2026955 /var/tmp/spdk-raid.sock 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2026955 ']' 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:11.547 10:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:11.548 10:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:11.548 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:11.548 10:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:11.548 10:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.548 [2024-07-12 10:38:46.625779] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
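(For reference, the setup that raid_write_error_test drives against this bdevperf instance mirrors the read-error run above. A minimal sketch of the RPC sequence, assembled from the trace that follows — rpc.py path abbreviated, socket path and bdev names exactly as used throughout this log:
  # create a malloc bdev, wrap it in an error-injection bdev, then expose it via passthru
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # repeat the three steps for BaseBdev2, then assemble the concat volume with a superblock
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
  # once bdevperf I/O is running, inject write failures on the first base bdev
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
End of sketch; the raw trace of these calls continues below.)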
00:12:11.548 [2024-07-12 10:38:46.625838] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2026955 ] 00:12:11.548 [2024-07-12 10:38:46.737032] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:11.807 [2024-07-12 10:38:46.841030] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:11.807 [2024-07-12 10:38:46.899060] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.807 [2024-07-12 10:38:46.899086] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.742 10:38:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:12.743 10:38:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:12.743 10:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:12.743 10:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:12.743 BaseBdev1_malloc 00:12:12.743 10:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:13.048 true 00:12:13.048 10:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:13.307 [2024-07-12 10:38:48.227056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:13.307 [2024-07-12 10:38:48.227103] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.307 [2024-07-12 10:38:48.227124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25120d0 00:12:13.307 [2024-07-12 10:38:48.227137] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.307 [2024-07-12 10:38:48.228954] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.307 [2024-07-12 10:38:48.228988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:13.307 BaseBdev1 00:12:13.307 10:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:13.307 10:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:13.307 BaseBdev2_malloc 00:12:13.307 10:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:13.565 true 00:12:13.565 10:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:13.823 [2024-07-12 10:38:48.965595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:13.823 [2024-07-12 10:38:48.965639] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.823 [2024-07-12 10:38:48.965659] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2516910 00:12:13.823 [2024-07-12 10:38:48.965671] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.823 [2024-07-12 10:38:48.967048] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.823 [2024-07-12 10:38:48.967075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:13.823 BaseBdev2 00:12:13.823 10:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:14.081 [2024-07-12 10:38:49.210259] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:14.081 [2024-07-12 10:38:49.211436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:14.081 [2024-07-12 10:38:49.211621] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2518320 00:12:14.081 [2024-07-12 10:38:49.211634] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:14.081 [2024-07-12 10:38:49.211806] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2519290 00:12:14.081 [2024-07-12 10:38:49.211943] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2518320 00:12:14.081 [2024-07-12 10:38:49.211953] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2518320 00:12:14.081 [2024-07-12 10:38:49.212048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.081 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.339 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.339 "name": "raid_bdev1", 00:12:14.339 "uuid": "5aae9e98-8d5f-46b0-8bd1-3c7e760c5bde", 00:12:14.339 "strip_size_kb": 64, 00:12:14.339 "state": "online", 00:12:14.339 
"raid_level": "concat", 00:12:14.339 "superblock": true, 00:12:14.339 "num_base_bdevs": 2, 00:12:14.339 "num_base_bdevs_discovered": 2, 00:12:14.339 "num_base_bdevs_operational": 2, 00:12:14.339 "base_bdevs_list": [ 00:12:14.339 { 00:12:14.339 "name": "BaseBdev1", 00:12:14.339 "uuid": "edc1220f-8030-59dd-b7c3-822178c4d7b1", 00:12:14.339 "is_configured": true, 00:12:14.339 "data_offset": 2048, 00:12:14.339 "data_size": 63488 00:12:14.339 }, 00:12:14.339 { 00:12:14.339 "name": "BaseBdev2", 00:12:14.339 "uuid": "cc88ae91-7fe2-5f04-8872-277c393ffa9a", 00:12:14.339 "is_configured": true, 00:12:14.339 "data_offset": 2048, 00:12:14.339 "data_size": 63488 00:12:14.339 } 00:12:14.339 ] 00:12:14.339 }' 00:12:14.339 10:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.339 10:38:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.903 10:38:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:14.903 10:38:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:15.161 [2024-07-12 10:38:50.157079] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25139b0 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.096 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:16.354 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.354 "name": "raid_bdev1", 00:12:16.354 "uuid": "5aae9e98-8d5f-46b0-8bd1-3c7e760c5bde", 00:12:16.354 "strip_size_kb": 
64, 00:12:16.354 "state": "online", 00:12:16.354 "raid_level": "concat", 00:12:16.354 "superblock": true, 00:12:16.354 "num_base_bdevs": 2, 00:12:16.354 "num_base_bdevs_discovered": 2, 00:12:16.354 "num_base_bdevs_operational": 2, 00:12:16.354 "base_bdevs_list": [ 00:12:16.354 { 00:12:16.354 "name": "BaseBdev1", 00:12:16.354 "uuid": "edc1220f-8030-59dd-b7c3-822178c4d7b1", 00:12:16.354 "is_configured": true, 00:12:16.354 "data_offset": 2048, 00:12:16.354 "data_size": 63488 00:12:16.354 }, 00:12:16.354 { 00:12:16.354 "name": "BaseBdev2", 00:12:16.354 "uuid": "cc88ae91-7fe2-5f04-8872-277c393ffa9a", 00:12:16.354 "is_configured": true, 00:12:16.354 "data_offset": 2048, 00:12:16.354 "data_size": 63488 00:12:16.354 } 00:12:16.354 ] 00:12:16.354 }' 00:12:16.354 10:38:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.354 10:38:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.920 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:17.179 [2024-07-12 10:38:52.188464] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:17.179 [2024-07-12 10:38:52.188514] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:17.179 [2024-07-12 10:38:52.191683] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:17.179 [2024-07-12 10:38:52.191736] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:17.179 [2024-07-12 10:38:52.191764] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:17.179 [2024-07-12 10:38:52.191774] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2518320 name raid_bdev1, state offline 00:12:17.179 0 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2026955 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2026955 ']' 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2026955 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2026955 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2026955' 00:12:17.179 killing process with pid 2026955 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2026955 00:12:17.179 [2024-07-12 10:38:52.257201] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:17.179 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2026955 00:12:17.179 [2024-07-12 10:38:52.267847] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job 
/raidtest/tmp.aN4uVhgU7E 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:12:17.438 00:12:17.438 real 0m5.953s 00:12:17.438 user 0m9.192s 00:12:17.438 sys 0m1.064s 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:17.438 10:38:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.438 ************************************ 00:12:17.438 END TEST raid_write_error_test 00:12:17.438 ************************************ 00:12:17.438 10:38:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:17.438 10:38:52 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:17.438 10:38:52 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:17.438 10:38:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:17.438 10:38:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:17.438 10:38:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:17.438 ************************************ 00:12:17.438 START TEST raid_state_function_test 00:12:17.438 ************************************ 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:17.438 10:38:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2027768 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2027768' 00:12:17.438 Process raid pid: 2027768 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2027768 /var/tmp/spdk-raid.sock 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2027768 ']' 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:17.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:17.438 10:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.438 [2024-07-12 10:38:52.622673] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
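(For reference, the state checks that raid_state_function_test performs below follow one pattern, taken from the trace: create the raid1 volume before its base bdevs exist, then read its state back via bdev_raid_get_bdevs. A minimal sketch — rpc.py path abbreviated, socket path as used throughout this log:
  # Existed_Raid is created while BaseBdev1/BaseBdev2 are still missing, so it reports "state": "configuring"
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  # registering the base bdevs (bdev_malloc_create 32 512 -b BaseBdev1, then BaseBdev2) moves the state to "online"
End of sketch; the raw trace of these calls continues below.)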
00:12:17.438 [2024-07-12 10:38:52.622717] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:17.697 [2024-07-12 10:38:52.736841] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:17.697 [2024-07-12 10:38:52.843825] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:17.956 [2024-07-12 10:38:52.906114] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:17.956 [2024-07-12 10:38:52.906142] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:17.956 10:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:17.956 10:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:17.956 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:18.213 [2024-07-12 10:38:53.333430] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:18.213 [2024-07-12 10:38:53.333471] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:18.213 [2024-07-12 10:38:53.333487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:18.213 [2024-07-12 10:38:53.333499] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.213 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.471 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.471 "name": "Existed_Raid", 00:12:18.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.471 "strip_size_kb": 0, 00:12:18.471 "state": "configuring", 00:12:18.471 "raid_level": "raid1", 00:12:18.471 "superblock": false, 00:12:18.471 
"num_base_bdevs": 2, 00:12:18.471 "num_base_bdevs_discovered": 0, 00:12:18.471 "num_base_bdevs_operational": 2, 00:12:18.471 "base_bdevs_list": [ 00:12:18.471 { 00:12:18.471 "name": "BaseBdev1", 00:12:18.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.471 "is_configured": false, 00:12:18.471 "data_offset": 0, 00:12:18.471 "data_size": 0 00:12:18.471 }, 00:12:18.471 { 00:12:18.471 "name": "BaseBdev2", 00:12:18.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.471 "is_configured": false, 00:12:18.471 "data_offset": 0, 00:12:18.471 "data_size": 0 00:12:18.471 } 00:12:18.471 ] 00:12:18.471 }' 00:12:18.471 10:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.471 10:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.037 10:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:19.296 [2024-07-12 10:38:54.412168] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:19.296 [2024-07-12 10:38:54.412193] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116ea80 name Existed_Raid, state configuring 00:12:19.297 10:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:19.556 [2024-07-12 10:38:54.588644] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:19.556 [2024-07-12 10:38:54.588671] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:19.556 [2024-07-12 10:38:54.588681] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:19.556 [2024-07-12 10:38:54.588692] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:19.556 10:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:19.815 [2024-07-12 10:38:54.779039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:19.815 BaseBdev1 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.815 10:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:20.074 [ 
00:12:20.074 { 00:12:20.074 "name": "BaseBdev1", 00:12:20.074 "aliases": [ 00:12:20.074 "ed39ceac-53eb-46e6-8870-630ceb3f71e9" 00:12:20.074 ], 00:12:20.074 "product_name": "Malloc disk", 00:12:20.074 "block_size": 512, 00:12:20.074 "num_blocks": 65536, 00:12:20.074 "uuid": "ed39ceac-53eb-46e6-8870-630ceb3f71e9", 00:12:20.074 "assigned_rate_limits": { 00:12:20.074 "rw_ios_per_sec": 0, 00:12:20.074 "rw_mbytes_per_sec": 0, 00:12:20.074 "r_mbytes_per_sec": 0, 00:12:20.074 "w_mbytes_per_sec": 0 00:12:20.074 }, 00:12:20.074 "claimed": true, 00:12:20.074 "claim_type": "exclusive_write", 00:12:20.074 "zoned": false, 00:12:20.074 "supported_io_types": { 00:12:20.074 "read": true, 00:12:20.074 "write": true, 00:12:20.074 "unmap": true, 00:12:20.074 "flush": true, 00:12:20.074 "reset": true, 00:12:20.074 "nvme_admin": false, 00:12:20.074 "nvme_io": false, 00:12:20.074 "nvme_io_md": false, 00:12:20.074 "write_zeroes": true, 00:12:20.074 "zcopy": true, 00:12:20.074 "get_zone_info": false, 00:12:20.074 "zone_management": false, 00:12:20.074 "zone_append": false, 00:12:20.074 "compare": false, 00:12:20.074 "compare_and_write": false, 00:12:20.074 "abort": true, 00:12:20.074 "seek_hole": false, 00:12:20.074 "seek_data": false, 00:12:20.074 "copy": true, 00:12:20.074 "nvme_iov_md": false 00:12:20.074 }, 00:12:20.074 "memory_domains": [ 00:12:20.074 { 00:12:20.074 "dma_device_id": "system", 00:12:20.074 "dma_device_type": 1 00:12:20.074 }, 00:12:20.074 { 00:12:20.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.074 "dma_device_type": 2 00:12:20.074 } 00:12:20.074 ], 00:12:20.074 "driver_specific": {} 00:12:20.074 } 00:12:20.074 ] 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.074 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.333 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.333 "name": "Existed_Raid", 00:12:20.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.333 "strip_size_kb": 0, 00:12:20.333 "state": "configuring", 00:12:20.333 "raid_level": "raid1", 
00:12:20.333 "superblock": false, 00:12:20.333 "num_base_bdevs": 2, 00:12:20.333 "num_base_bdevs_discovered": 1, 00:12:20.333 "num_base_bdevs_operational": 2, 00:12:20.333 "base_bdevs_list": [ 00:12:20.333 { 00:12:20.333 "name": "BaseBdev1", 00:12:20.333 "uuid": "ed39ceac-53eb-46e6-8870-630ceb3f71e9", 00:12:20.333 "is_configured": true, 00:12:20.333 "data_offset": 0, 00:12:20.333 "data_size": 65536 00:12:20.333 }, 00:12:20.333 { 00:12:20.333 "name": "BaseBdev2", 00:12:20.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.333 "is_configured": false, 00:12:20.333 "data_offset": 0, 00:12:20.333 "data_size": 0 00:12:20.333 } 00:12:20.333 ] 00:12:20.333 }' 00:12:20.333 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.333 10:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.900 10:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:21.159 [2024-07-12 10:38:56.150696] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:21.159 [2024-07-12 10:38:56.150733] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116e350 name Existed_Raid, state configuring 00:12:21.159 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:21.417 [2024-07-12 10:38:56.399374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:21.417 [2024-07-12 10:38:56.400850] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:21.417 [2024-07-12 10:38:56.400883] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:21.417 10:38:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.676 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.676 "name": "Existed_Raid", 00:12:21.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.676 "strip_size_kb": 0, 00:12:21.676 "state": "configuring", 00:12:21.676 "raid_level": "raid1", 00:12:21.676 "superblock": false, 00:12:21.676 "num_base_bdevs": 2, 00:12:21.676 "num_base_bdevs_discovered": 1, 00:12:21.676 "num_base_bdevs_operational": 2, 00:12:21.676 "base_bdevs_list": [ 00:12:21.676 { 00:12:21.676 "name": "BaseBdev1", 00:12:21.676 "uuid": "ed39ceac-53eb-46e6-8870-630ceb3f71e9", 00:12:21.676 "is_configured": true, 00:12:21.676 "data_offset": 0, 00:12:21.676 "data_size": 65536 00:12:21.676 }, 00:12:21.676 { 00:12:21.676 "name": "BaseBdev2", 00:12:21.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.676 "is_configured": false, 00:12:21.676 "data_offset": 0, 00:12:21.676 "data_size": 0 00:12:21.676 } 00:12:21.676 ] 00:12:21.676 }' 00:12:21.676 10:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.676 10:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.243 10:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:22.501 [2024-07-12 10:38:57.509679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:22.501 [2024-07-12 10:38:57.509717] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x116f000 00:12:22.501 [2024-07-12 10:38:57.509725] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:22.501 [2024-07-12 10:38:57.509917] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10890c0 00:12:22.501 [2024-07-12 10:38:57.510034] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116f000 00:12:22.501 [2024-07-12 10:38:57.510044] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x116f000 00:12:22.501 [2024-07-12 10:38:57.510203] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:22.501 BaseBdev2 00:12:22.501 10:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:22.501 10:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:22.501 10:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:22.501 10:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:22.501 10:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:22.501 10:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:22.501 10:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:22.758 10:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:23.016 
[ 00:12:23.016 { 00:12:23.016 "name": "BaseBdev2", 00:12:23.016 "aliases": [ 00:12:23.016 "f1e1bf4d-aa3e-4a12-b121-dba0f5361173" 00:12:23.016 ], 00:12:23.016 "product_name": "Malloc disk", 00:12:23.016 "block_size": 512, 00:12:23.016 "num_blocks": 65536, 00:12:23.016 "uuid": "f1e1bf4d-aa3e-4a12-b121-dba0f5361173", 00:12:23.016 "assigned_rate_limits": { 00:12:23.016 "rw_ios_per_sec": 0, 00:12:23.016 "rw_mbytes_per_sec": 0, 00:12:23.016 "r_mbytes_per_sec": 0, 00:12:23.016 "w_mbytes_per_sec": 0 00:12:23.016 }, 00:12:23.016 "claimed": true, 00:12:23.016 "claim_type": "exclusive_write", 00:12:23.016 "zoned": false, 00:12:23.016 "supported_io_types": { 00:12:23.016 "read": true, 00:12:23.016 "write": true, 00:12:23.016 "unmap": true, 00:12:23.016 "flush": true, 00:12:23.016 "reset": true, 00:12:23.016 "nvme_admin": false, 00:12:23.016 "nvme_io": false, 00:12:23.016 "nvme_io_md": false, 00:12:23.016 "write_zeroes": true, 00:12:23.016 "zcopy": true, 00:12:23.016 "get_zone_info": false, 00:12:23.016 "zone_management": false, 00:12:23.016 "zone_append": false, 00:12:23.016 "compare": false, 00:12:23.016 "compare_and_write": false, 00:12:23.016 "abort": true, 00:12:23.016 "seek_hole": false, 00:12:23.016 "seek_data": false, 00:12:23.016 "copy": true, 00:12:23.016 "nvme_iov_md": false 00:12:23.016 }, 00:12:23.016 "memory_domains": [ 00:12:23.016 { 00:12:23.016 "dma_device_id": "system", 00:12:23.016 "dma_device_type": 1 00:12:23.016 }, 00:12:23.016 { 00:12:23.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.016 "dma_device_type": 2 00:12:23.016 } 00:12:23.016 ], 00:12:23.016 "driver_specific": {} 00:12:23.016 } 00:12:23.016 ] 00:12:23.016 10:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:23.016 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:23.016 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:23.016 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:23.016 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.017 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.274 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:23.274 "name": "Existed_Raid", 00:12:23.274 "uuid": "9dbb944f-3eae-4620-a959-cfa14bad2c4c", 00:12:23.274 "strip_size_kb": 0, 00:12:23.274 "state": "online", 00:12:23.274 "raid_level": "raid1", 00:12:23.274 "superblock": false, 00:12:23.274 "num_base_bdevs": 2, 00:12:23.274 "num_base_bdevs_discovered": 2, 00:12:23.274 "num_base_bdevs_operational": 2, 00:12:23.274 "base_bdevs_list": [ 00:12:23.274 { 00:12:23.274 "name": "BaseBdev1", 00:12:23.274 "uuid": "ed39ceac-53eb-46e6-8870-630ceb3f71e9", 00:12:23.274 "is_configured": true, 00:12:23.274 "data_offset": 0, 00:12:23.274 "data_size": 65536 00:12:23.274 }, 00:12:23.274 { 00:12:23.274 "name": "BaseBdev2", 00:12:23.274 "uuid": "f1e1bf4d-aa3e-4a12-b121-dba0f5361173", 00:12:23.274 "is_configured": true, 00:12:23.274 "data_offset": 0, 00:12:23.274 "data_size": 65536 00:12:23.274 } 00:12:23.274 ] 00:12:23.274 }' 00:12:23.274 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.274 10:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:23.840 10:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:24.097 [2024-07-12 10:38:59.082127] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:24.097 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:24.097 "name": "Existed_Raid", 00:12:24.097 "aliases": [ 00:12:24.097 "9dbb944f-3eae-4620-a959-cfa14bad2c4c" 00:12:24.097 ], 00:12:24.097 "product_name": "Raid Volume", 00:12:24.097 "block_size": 512, 00:12:24.097 "num_blocks": 65536, 00:12:24.097 "uuid": "9dbb944f-3eae-4620-a959-cfa14bad2c4c", 00:12:24.097 "assigned_rate_limits": { 00:12:24.097 "rw_ios_per_sec": 0, 00:12:24.097 "rw_mbytes_per_sec": 0, 00:12:24.097 "r_mbytes_per_sec": 0, 00:12:24.097 "w_mbytes_per_sec": 0 00:12:24.097 }, 00:12:24.097 "claimed": false, 00:12:24.097 "zoned": false, 00:12:24.097 "supported_io_types": { 00:12:24.097 "read": true, 00:12:24.097 "write": true, 00:12:24.097 "unmap": false, 00:12:24.097 "flush": false, 00:12:24.097 "reset": true, 00:12:24.097 "nvme_admin": false, 00:12:24.097 "nvme_io": false, 00:12:24.097 "nvme_io_md": false, 00:12:24.097 "write_zeroes": true, 00:12:24.097 "zcopy": false, 00:12:24.097 "get_zone_info": false, 00:12:24.097 "zone_management": false, 00:12:24.097 "zone_append": false, 00:12:24.097 "compare": false, 00:12:24.097 "compare_and_write": false, 00:12:24.097 "abort": false, 00:12:24.097 "seek_hole": false, 00:12:24.097 "seek_data": false, 00:12:24.097 "copy": false, 00:12:24.097 "nvme_iov_md": false 00:12:24.097 }, 00:12:24.097 
"memory_domains": [ 00:12:24.097 { 00:12:24.097 "dma_device_id": "system", 00:12:24.097 "dma_device_type": 1 00:12:24.097 }, 00:12:24.097 { 00:12:24.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.097 "dma_device_type": 2 00:12:24.097 }, 00:12:24.097 { 00:12:24.097 "dma_device_id": "system", 00:12:24.097 "dma_device_type": 1 00:12:24.097 }, 00:12:24.097 { 00:12:24.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.097 "dma_device_type": 2 00:12:24.097 } 00:12:24.097 ], 00:12:24.097 "driver_specific": { 00:12:24.097 "raid": { 00:12:24.097 "uuid": "9dbb944f-3eae-4620-a959-cfa14bad2c4c", 00:12:24.097 "strip_size_kb": 0, 00:12:24.097 "state": "online", 00:12:24.097 "raid_level": "raid1", 00:12:24.098 "superblock": false, 00:12:24.098 "num_base_bdevs": 2, 00:12:24.098 "num_base_bdevs_discovered": 2, 00:12:24.098 "num_base_bdevs_operational": 2, 00:12:24.098 "base_bdevs_list": [ 00:12:24.098 { 00:12:24.098 "name": "BaseBdev1", 00:12:24.098 "uuid": "ed39ceac-53eb-46e6-8870-630ceb3f71e9", 00:12:24.098 "is_configured": true, 00:12:24.098 "data_offset": 0, 00:12:24.098 "data_size": 65536 00:12:24.098 }, 00:12:24.098 { 00:12:24.098 "name": "BaseBdev2", 00:12:24.098 "uuid": "f1e1bf4d-aa3e-4a12-b121-dba0f5361173", 00:12:24.098 "is_configured": true, 00:12:24.098 "data_offset": 0, 00:12:24.098 "data_size": 65536 00:12:24.098 } 00:12:24.098 ] 00:12:24.098 } 00:12:24.098 } 00:12:24.098 }' 00:12:24.098 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:24.098 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:24.098 BaseBdev2' 00:12:24.098 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:24.098 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:24.098 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:24.354 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:24.354 "name": "BaseBdev1", 00:12:24.354 "aliases": [ 00:12:24.354 "ed39ceac-53eb-46e6-8870-630ceb3f71e9" 00:12:24.354 ], 00:12:24.354 "product_name": "Malloc disk", 00:12:24.354 "block_size": 512, 00:12:24.354 "num_blocks": 65536, 00:12:24.354 "uuid": "ed39ceac-53eb-46e6-8870-630ceb3f71e9", 00:12:24.354 "assigned_rate_limits": { 00:12:24.354 "rw_ios_per_sec": 0, 00:12:24.354 "rw_mbytes_per_sec": 0, 00:12:24.354 "r_mbytes_per_sec": 0, 00:12:24.354 "w_mbytes_per_sec": 0 00:12:24.354 }, 00:12:24.354 "claimed": true, 00:12:24.354 "claim_type": "exclusive_write", 00:12:24.354 "zoned": false, 00:12:24.354 "supported_io_types": { 00:12:24.354 "read": true, 00:12:24.354 "write": true, 00:12:24.354 "unmap": true, 00:12:24.354 "flush": true, 00:12:24.354 "reset": true, 00:12:24.354 "nvme_admin": false, 00:12:24.354 "nvme_io": false, 00:12:24.354 "nvme_io_md": false, 00:12:24.354 "write_zeroes": true, 00:12:24.354 "zcopy": true, 00:12:24.354 "get_zone_info": false, 00:12:24.354 "zone_management": false, 00:12:24.354 "zone_append": false, 00:12:24.354 "compare": false, 00:12:24.354 "compare_and_write": false, 00:12:24.354 "abort": true, 00:12:24.354 "seek_hole": false, 00:12:24.354 "seek_data": false, 00:12:24.354 "copy": true, 00:12:24.354 "nvme_iov_md": false 00:12:24.354 }, 00:12:24.354 
"memory_domains": [ 00:12:24.354 { 00:12:24.354 "dma_device_id": "system", 00:12:24.354 "dma_device_type": 1 00:12:24.354 }, 00:12:24.354 { 00:12:24.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.354 "dma_device_type": 2 00:12:24.354 } 00:12:24.354 ], 00:12:24.354 "driver_specific": {} 00:12:24.354 }' 00:12:24.354 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.354 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.354 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:24.354 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.354 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:24.612 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:24.871 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:24.871 "name": "BaseBdev2", 00:12:24.871 "aliases": [ 00:12:24.871 "f1e1bf4d-aa3e-4a12-b121-dba0f5361173" 00:12:24.871 ], 00:12:24.871 "product_name": "Malloc disk", 00:12:24.871 "block_size": 512, 00:12:24.871 "num_blocks": 65536, 00:12:24.871 "uuid": "f1e1bf4d-aa3e-4a12-b121-dba0f5361173", 00:12:24.871 "assigned_rate_limits": { 00:12:24.871 "rw_ios_per_sec": 0, 00:12:24.871 "rw_mbytes_per_sec": 0, 00:12:24.871 "r_mbytes_per_sec": 0, 00:12:24.871 "w_mbytes_per_sec": 0 00:12:24.871 }, 00:12:24.871 "claimed": true, 00:12:24.871 "claim_type": "exclusive_write", 00:12:24.871 "zoned": false, 00:12:24.871 "supported_io_types": { 00:12:24.871 "read": true, 00:12:24.871 "write": true, 00:12:24.871 "unmap": true, 00:12:24.871 "flush": true, 00:12:24.871 "reset": true, 00:12:24.871 "nvme_admin": false, 00:12:24.871 "nvme_io": false, 00:12:24.871 "nvme_io_md": false, 00:12:24.871 "write_zeroes": true, 00:12:24.871 "zcopy": true, 00:12:24.871 "get_zone_info": false, 00:12:24.871 "zone_management": false, 00:12:24.871 "zone_append": false, 00:12:24.871 "compare": false, 00:12:24.871 "compare_and_write": false, 00:12:24.871 "abort": true, 00:12:24.871 "seek_hole": false, 00:12:24.871 "seek_data": false, 00:12:24.871 "copy": true, 00:12:24.871 "nvme_iov_md": false 00:12:24.871 }, 00:12:24.871 "memory_domains": [ 00:12:24.871 { 00:12:24.871 "dma_device_id": "system", 00:12:24.871 "dma_device_type": 1 00:12:24.871 }, 00:12:24.871 { 00:12:24.871 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:24.871 "dma_device_type": 2 00:12:24.871 } 00:12:24.871 ], 00:12:24.871 "driver_specific": {} 00:12:24.871 }' 00:12:24.871 10:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.871 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:25.129 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:25.388 [2024-07-12 10:39:00.553843] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:25.388 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:25.389 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:25.647 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.647 "name": "Existed_Raid", 00:12:25.647 "uuid": "9dbb944f-3eae-4620-a959-cfa14bad2c4c", 00:12:25.647 "strip_size_kb": 0, 00:12:25.647 "state": "online", 00:12:25.647 "raid_level": "raid1", 00:12:25.647 "superblock": false, 00:12:25.647 "num_base_bdevs": 2, 00:12:25.647 "num_base_bdevs_discovered": 1, 00:12:25.647 "num_base_bdevs_operational": 1, 00:12:25.647 "base_bdevs_list": [ 00:12:25.647 { 00:12:25.647 "name": null, 00:12:25.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:25.648 "is_configured": false, 00:12:25.648 "data_offset": 0, 00:12:25.648 "data_size": 65536 00:12:25.648 }, 00:12:25.648 { 00:12:25.648 "name": "BaseBdev2", 00:12:25.648 "uuid": "f1e1bf4d-aa3e-4a12-b121-dba0f5361173", 00:12:25.648 "is_configured": true, 00:12:25.648 "data_offset": 0, 00:12:25.648 "data_size": 65536 00:12:25.648 } 00:12:25.648 ] 00:12:25.648 }' 00:12:25.648 10:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.648 10:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.583 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:26.583 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:26.583 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.583 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:26.583 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:26.583 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:26.583 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:26.841 [2024-07-12 10:39:01.838560] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:26.841 [2024-07-12 10:39:01.838638] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:26.841 [2024-07-12 10:39:01.849640] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:26.841 [2024-07-12 10:39:01.849676] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:26.841 [2024-07-12 10:39:01.849687] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116f000 name Existed_Raid, state offline 00:12:26.841 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:26.841 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:26.841 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.841 10:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
raid_bdev= 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2027768 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2027768 ']' 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2027768 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2027768 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2027768' 00:12:27.099 killing process with pid 2027768 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2027768 00:12:27.099 [2024-07-12 10:39:02.169578] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:27.099 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2027768 00:12:27.099 [2024-07-12 10:39:02.170554] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:27.370 00:12:27.370 real 0m9.814s 00:12:27.370 user 0m17.793s 00:12:27.370 sys 0m1.937s 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.370 ************************************ 00:12:27.370 END TEST raid_state_function_test 00:12:27.370 ************************************ 00:12:27.370 10:39:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:27.370 10:39:02 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:27.370 10:39:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:27.370 10:39:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:27.370 10:39:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:27.370 ************************************ 00:12:27.370 START TEST raid_state_function_test_sb 00:12:27.370 ************************************ 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2029503 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2029503' 00:12:27.370 Process raid pid: 2029503 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2029503 /var/tmp/spdk-raid.sock 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2029503 ']' 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:27.370 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
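The prologue above just brings up a bare bdev_svc application with raid debug logging and waits for its RPC socket; a rough equivalent with the binary path and flags taken from this run (the polling loop is an illustrative stand-in for the waitforlisten helper, not its actual implementation):

    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    # wait until the app answers RPCs on the socket before the test starts issuing bdev_raid_create calls
    until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
            rpc_get_methods >/dev/null 2>&1; do
        sleep 0.1
    done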
00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:27.370 10:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:27.370 [2024-07-12 10:39:02.514156] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:12:27.370 [2024-07-12 10:39:02.514201] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:27.647 [2024-07-12 10:39:02.626987] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:27.647 [2024-07-12 10:39:02.734940] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:27.647 [2024-07-12 10:39:02.799707] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:27.647 [2024-07-12 10:39:02.799737] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:28.582 [2024-07-12 10:39:03.646248] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:28.582 [2024-07-12 10:39:03.646290] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:28.582 [2024-07-12 10:39:03.646302] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:28.582 [2024-07-12 10:39:03.646314] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.582 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:12:28.840 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.840 "name": "Existed_Raid", 00:12:28.840 "uuid": "0deb969b-d6fb-4c8f-b07b-d61e422d1631", 00:12:28.840 "strip_size_kb": 0, 00:12:28.840 "state": "configuring", 00:12:28.840 "raid_level": "raid1", 00:12:28.840 "superblock": true, 00:12:28.840 "num_base_bdevs": 2, 00:12:28.840 "num_base_bdevs_discovered": 0, 00:12:28.840 "num_base_bdevs_operational": 2, 00:12:28.840 "base_bdevs_list": [ 00:12:28.840 { 00:12:28.840 "name": "BaseBdev1", 00:12:28.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.840 "is_configured": false, 00:12:28.840 "data_offset": 0, 00:12:28.840 "data_size": 0 00:12:28.840 }, 00:12:28.840 { 00:12:28.840 "name": "BaseBdev2", 00:12:28.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.840 "is_configured": false, 00:12:28.840 "data_offset": 0, 00:12:28.840 "data_size": 0 00:12:28.840 } 00:12:28.840 ] 00:12:28.840 }' 00:12:28.840 10:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.840 10:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:29.407 10:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:29.665 [2024-07-12 10:39:04.737004] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:29.665 [2024-07-12 10:39:04.737038] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbda80 name Existed_Raid, state configuring 00:12:29.665 10:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:29.924 [2024-07-12 10:39:04.981669] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:29.924 [2024-07-12 10:39:04.981701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:29.924 [2024-07-12 10:39:04.981711] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:29.924 [2024-07-12 10:39:04.981722] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:29.924 10:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:30.182 [2024-07-12 10:39:05.237337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:30.182 BaseBdev1 00:12:30.182 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:30.182 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:30.182 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:30.182 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:30.182 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:30.182 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:30.182 10:39:05 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:30.441 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:30.700 [ 00:12:30.700 { 00:12:30.700 "name": "BaseBdev1", 00:12:30.700 "aliases": [ 00:12:30.700 "9d59efa2-22ba-470c-8299-09cf8f26c88a" 00:12:30.700 ], 00:12:30.700 "product_name": "Malloc disk", 00:12:30.700 "block_size": 512, 00:12:30.700 "num_blocks": 65536, 00:12:30.700 "uuid": "9d59efa2-22ba-470c-8299-09cf8f26c88a", 00:12:30.700 "assigned_rate_limits": { 00:12:30.700 "rw_ios_per_sec": 0, 00:12:30.700 "rw_mbytes_per_sec": 0, 00:12:30.700 "r_mbytes_per_sec": 0, 00:12:30.700 "w_mbytes_per_sec": 0 00:12:30.700 }, 00:12:30.700 "claimed": true, 00:12:30.700 "claim_type": "exclusive_write", 00:12:30.700 "zoned": false, 00:12:30.700 "supported_io_types": { 00:12:30.700 "read": true, 00:12:30.700 "write": true, 00:12:30.700 "unmap": true, 00:12:30.700 "flush": true, 00:12:30.700 "reset": true, 00:12:30.700 "nvme_admin": false, 00:12:30.700 "nvme_io": false, 00:12:30.700 "nvme_io_md": false, 00:12:30.700 "write_zeroes": true, 00:12:30.700 "zcopy": true, 00:12:30.700 "get_zone_info": false, 00:12:30.700 "zone_management": false, 00:12:30.700 "zone_append": false, 00:12:30.700 "compare": false, 00:12:30.700 "compare_and_write": false, 00:12:30.700 "abort": true, 00:12:30.700 "seek_hole": false, 00:12:30.700 "seek_data": false, 00:12:30.700 "copy": true, 00:12:30.700 "nvme_iov_md": false 00:12:30.700 }, 00:12:30.700 "memory_domains": [ 00:12:30.700 { 00:12:30.700 "dma_device_id": "system", 00:12:30.700 "dma_device_type": 1 00:12:30.700 }, 00:12:30.700 { 00:12:30.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.700 "dma_device_type": 2 00:12:30.700 } 00:12:30.700 ], 00:12:30.700 "driver_specific": {} 00:12:30.700 } 00:12:30.700 ] 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:12:30.700 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.958 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.958 "name": "Existed_Raid", 00:12:30.958 "uuid": "0014c410-1110-4750-af93-ae2de0885dd0", 00:12:30.958 "strip_size_kb": 0, 00:12:30.958 "state": "configuring", 00:12:30.958 "raid_level": "raid1", 00:12:30.958 "superblock": true, 00:12:30.958 "num_base_bdevs": 2, 00:12:30.958 "num_base_bdevs_discovered": 1, 00:12:30.958 "num_base_bdevs_operational": 2, 00:12:30.958 "base_bdevs_list": [ 00:12:30.958 { 00:12:30.958 "name": "BaseBdev1", 00:12:30.958 "uuid": "9d59efa2-22ba-470c-8299-09cf8f26c88a", 00:12:30.958 "is_configured": true, 00:12:30.958 "data_offset": 2048, 00:12:30.958 "data_size": 63488 00:12:30.958 }, 00:12:30.958 { 00:12:30.958 "name": "BaseBdev2", 00:12:30.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.958 "is_configured": false, 00:12:30.958 "data_offset": 0, 00:12:30.958 "data_size": 0 00:12:30.958 } 00:12:30.958 ] 00:12:30.958 }' 00:12:30.958 10:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.958 10:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:31.523 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:31.523 [2024-07-12 10:39:06.713259] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:31.523 [2024-07-12 10:39:06.713295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbd350 name Existed_Raid, state configuring 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:31.782 [2024-07-12 10:39:06.945912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:31.782 [2024-07-12 10:39:06.947539] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:31.782 [2024-07-12 10:39:06.947572] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.782 
10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.782 10:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.039 10:39:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.039 "name": "Existed_Raid", 00:12:32.039 "uuid": "ae140f00-34cc-4cb1-90f1-29d392df6e84", 00:12:32.039 "strip_size_kb": 0, 00:12:32.039 "state": "configuring", 00:12:32.039 "raid_level": "raid1", 00:12:32.039 "superblock": true, 00:12:32.039 "num_base_bdevs": 2, 00:12:32.039 "num_base_bdevs_discovered": 1, 00:12:32.039 "num_base_bdevs_operational": 2, 00:12:32.039 "base_bdevs_list": [ 00:12:32.039 { 00:12:32.039 "name": "BaseBdev1", 00:12:32.039 "uuid": "9d59efa2-22ba-470c-8299-09cf8f26c88a", 00:12:32.039 "is_configured": true, 00:12:32.039 "data_offset": 2048, 00:12:32.039 "data_size": 63488 00:12:32.039 }, 00:12:32.039 { 00:12:32.039 "name": "BaseBdev2", 00:12:32.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.039 "is_configured": false, 00:12:32.039 "data_offset": 0, 00:12:32.039 "data_size": 0 00:12:32.039 } 00:12:32.039 ] 00:12:32.039 }' 00:12:32.039 10:39:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.039 10:39:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:32.971 10:39:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:32.971 [2024-07-12 10:39:08.040181] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:32.971 [2024-07-12 10:39:08.040326] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bbe000 00:12:32.971 [2024-07-12 10:39:08.040339] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:32.971 [2024-07-12 10:39:08.040518] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ad80c0 00:12:32.971 [2024-07-12 10:39:08.040640] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bbe000 00:12:32.971 [2024-07-12 10:39:08.040651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bbe000 00:12:32.971 [2024-07-12 10:39:08.040743] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:32.971 BaseBdev2 00:12:32.971 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:32.971 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:32.971 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:32.971 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:32.971 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
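Condensed, the array build-up traced in this stage is three RPCs; because the raid is created with -s, space is reserved for the on-disk superblock, which is why the configured base bdevs report data_offset 2048 and data_size 63488 out of their 65536 blocks. A sketch with paths and arguments copied from this run (RPC is shorthand introduced here):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid  # stays "configuring" while bases are missing
    $RPC bdev_malloc_create 32 512 -b BaseBdev1   # 32 MiB / 512 B blocks = 65536 blocks; claimed as the first base
    $RPC bdev_malloc_create 32 512 -b BaseBdev2   # second base discovered -> Existed_Raid transitions to "online"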
00:12:32.971 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:32.971 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:33.229 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:33.488 [ 00:12:33.488 { 00:12:33.488 "name": "BaseBdev2", 00:12:33.488 "aliases": [ 00:12:33.488 "9ef264eb-ed27-40e2-8da3-591f432b96a8" 00:12:33.488 ], 00:12:33.488 "product_name": "Malloc disk", 00:12:33.488 "block_size": 512, 00:12:33.488 "num_blocks": 65536, 00:12:33.488 "uuid": "9ef264eb-ed27-40e2-8da3-591f432b96a8", 00:12:33.488 "assigned_rate_limits": { 00:12:33.488 "rw_ios_per_sec": 0, 00:12:33.488 "rw_mbytes_per_sec": 0, 00:12:33.488 "r_mbytes_per_sec": 0, 00:12:33.488 "w_mbytes_per_sec": 0 00:12:33.488 }, 00:12:33.488 "claimed": true, 00:12:33.488 "claim_type": "exclusive_write", 00:12:33.488 "zoned": false, 00:12:33.488 "supported_io_types": { 00:12:33.488 "read": true, 00:12:33.488 "write": true, 00:12:33.488 "unmap": true, 00:12:33.488 "flush": true, 00:12:33.488 "reset": true, 00:12:33.488 "nvme_admin": false, 00:12:33.488 "nvme_io": false, 00:12:33.488 "nvme_io_md": false, 00:12:33.488 "write_zeroes": true, 00:12:33.488 "zcopy": true, 00:12:33.488 "get_zone_info": false, 00:12:33.488 "zone_management": false, 00:12:33.488 "zone_append": false, 00:12:33.488 "compare": false, 00:12:33.488 "compare_and_write": false, 00:12:33.488 "abort": true, 00:12:33.488 "seek_hole": false, 00:12:33.488 "seek_data": false, 00:12:33.488 "copy": true, 00:12:33.488 "nvme_iov_md": false 00:12:33.488 }, 00:12:33.488 "memory_domains": [ 00:12:33.488 { 00:12:33.488 "dma_device_id": "system", 00:12:33.488 "dma_device_type": 1 00:12:33.488 }, 00:12:33.488 { 00:12:33.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.488 "dma_device_type": 2 00:12:33.488 } 00:12:33.488 ], 00:12:33.488 "driver_specific": {} 00:12:33.488 } 00:12:33.488 ] 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.488 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.747 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.747 "name": "Existed_Raid", 00:12:33.747 "uuid": "ae140f00-34cc-4cb1-90f1-29d392df6e84", 00:12:33.747 "strip_size_kb": 0, 00:12:33.747 "state": "online", 00:12:33.747 "raid_level": "raid1", 00:12:33.747 "superblock": true, 00:12:33.747 "num_base_bdevs": 2, 00:12:33.747 "num_base_bdevs_discovered": 2, 00:12:33.747 "num_base_bdevs_operational": 2, 00:12:33.747 "base_bdevs_list": [ 00:12:33.747 { 00:12:33.747 "name": "BaseBdev1", 00:12:33.747 "uuid": "9d59efa2-22ba-470c-8299-09cf8f26c88a", 00:12:33.747 "is_configured": true, 00:12:33.747 "data_offset": 2048, 00:12:33.747 "data_size": 63488 00:12:33.747 }, 00:12:33.747 { 00:12:33.747 "name": "BaseBdev2", 00:12:33.747 "uuid": "9ef264eb-ed27-40e2-8da3-591f432b96a8", 00:12:33.747 "is_configured": true, 00:12:33.747 "data_offset": 2048, 00:12:33.747 "data_size": 63488 00:12:33.747 } 00:12:33.747 ] 00:12:33.747 }' 00:12:33.747 10:39:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.747 10:39:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:34.315 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:34.573 [2024-07-12 10:39:09.612682] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:34.573 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:34.573 "name": "Existed_Raid", 00:12:34.573 "aliases": [ 00:12:34.573 "ae140f00-34cc-4cb1-90f1-29d392df6e84" 00:12:34.573 ], 00:12:34.573 "product_name": "Raid Volume", 00:12:34.573 "block_size": 512, 00:12:34.573 "num_blocks": 63488, 00:12:34.573 "uuid": "ae140f00-34cc-4cb1-90f1-29d392df6e84", 00:12:34.573 "assigned_rate_limits": { 00:12:34.573 "rw_ios_per_sec": 0, 00:12:34.573 "rw_mbytes_per_sec": 0, 00:12:34.573 "r_mbytes_per_sec": 0, 00:12:34.573 "w_mbytes_per_sec": 0 00:12:34.573 }, 00:12:34.573 "claimed": false, 00:12:34.573 "zoned": false, 00:12:34.573 "supported_io_types": { 00:12:34.573 "read": true, 
00:12:34.573 "write": true, 00:12:34.573 "unmap": false, 00:12:34.573 "flush": false, 00:12:34.573 "reset": true, 00:12:34.573 "nvme_admin": false, 00:12:34.573 "nvme_io": false, 00:12:34.573 "nvme_io_md": false, 00:12:34.573 "write_zeroes": true, 00:12:34.573 "zcopy": false, 00:12:34.573 "get_zone_info": false, 00:12:34.573 "zone_management": false, 00:12:34.573 "zone_append": false, 00:12:34.573 "compare": false, 00:12:34.573 "compare_and_write": false, 00:12:34.573 "abort": false, 00:12:34.573 "seek_hole": false, 00:12:34.573 "seek_data": false, 00:12:34.573 "copy": false, 00:12:34.573 "nvme_iov_md": false 00:12:34.573 }, 00:12:34.573 "memory_domains": [ 00:12:34.573 { 00:12:34.573 "dma_device_id": "system", 00:12:34.573 "dma_device_type": 1 00:12:34.573 }, 00:12:34.573 { 00:12:34.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.573 "dma_device_type": 2 00:12:34.573 }, 00:12:34.573 { 00:12:34.573 "dma_device_id": "system", 00:12:34.573 "dma_device_type": 1 00:12:34.573 }, 00:12:34.573 { 00:12:34.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.573 "dma_device_type": 2 00:12:34.573 } 00:12:34.573 ], 00:12:34.573 "driver_specific": { 00:12:34.573 "raid": { 00:12:34.573 "uuid": "ae140f00-34cc-4cb1-90f1-29d392df6e84", 00:12:34.573 "strip_size_kb": 0, 00:12:34.573 "state": "online", 00:12:34.573 "raid_level": "raid1", 00:12:34.573 "superblock": true, 00:12:34.573 "num_base_bdevs": 2, 00:12:34.573 "num_base_bdevs_discovered": 2, 00:12:34.573 "num_base_bdevs_operational": 2, 00:12:34.573 "base_bdevs_list": [ 00:12:34.573 { 00:12:34.573 "name": "BaseBdev1", 00:12:34.573 "uuid": "9d59efa2-22ba-470c-8299-09cf8f26c88a", 00:12:34.573 "is_configured": true, 00:12:34.573 "data_offset": 2048, 00:12:34.573 "data_size": 63488 00:12:34.573 }, 00:12:34.573 { 00:12:34.573 "name": "BaseBdev2", 00:12:34.573 "uuid": "9ef264eb-ed27-40e2-8da3-591f432b96a8", 00:12:34.573 "is_configured": true, 00:12:34.573 "data_offset": 2048, 00:12:34.573 "data_size": 63488 00:12:34.573 } 00:12:34.573 ] 00:12:34.573 } 00:12:34.573 } 00:12:34.573 }' 00:12:34.573 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:34.573 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:34.573 BaseBdev2' 00:12:34.573 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:34.573 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:34.573 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:34.832 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:34.832 "name": "BaseBdev1", 00:12:34.832 "aliases": [ 00:12:34.832 "9d59efa2-22ba-470c-8299-09cf8f26c88a" 00:12:34.832 ], 00:12:34.832 "product_name": "Malloc disk", 00:12:34.832 "block_size": 512, 00:12:34.832 "num_blocks": 65536, 00:12:34.832 "uuid": "9d59efa2-22ba-470c-8299-09cf8f26c88a", 00:12:34.832 "assigned_rate_limits": { 00:12:34.832 "rw_ios_per_sec": 0, 00:12:34.832 "rw_mbytes_per_sec": 0, 00:12:34.832 "r_mbytes_per_sec": 0, 00:12:34.832 "w_mbytes_per_sec": 0 00:12:34.832 }, 00:12:34.832 "claimed": true, 00:12:34.832 "claim_type": "exclusive_write", 00:12:34.832 "zoned": false, 00:12:34.832 "supported_io_types": { 
00:12:34.832 "read": true, 00:12:34.832 "write": true, 00:12:34.832 "unmap": true, 00:12:34.832 "flush": true, 00:12:34.832 "reset": true, 00:12:34.832 "nvme_admin": false, 00:12:34.832 "nvme_io": false, 00:12:34.832 "nvme_io_md": false, 00:12:34.832 "write_zeroes": true, 00:12:34.832 "zcopy": true, 00:12:34.832 "get_zone_info": false, 00:12:34.832 "zone_management": false, 00:12:34.832 "zone_append": false, 00:12:34.832 "compare": false, 00:12:34.832 "compare_and_write": false, 00:12:34.832 "abort": true, 00:12:34.832 "seek_hole": false, 00:12:34.832 "seek_data": false, 00:12:34.832 "copy": true, 00:12:34.832 "nvme_iov_md": false 00:12:34.832 }, 00:12:34.832 "memory_domains": [ 00:12:34.832 { 00:12:34.832 "dma_device_id": "system", 00:12:34.832 "dma_device_type": 1 00:12:34.832 }, 00:12:34.832 { 00:12:34.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.832 "dma_device_type": 2 00:12:34.832 } 00:12:34.832 ], 00:12:34.832 "driver_specific": {} 00:12:34.832 }' 00:12:34.832 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.832 10:39:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.832 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:34.832 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.091 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.091 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.091 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.091 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.091 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.091 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.091 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.349 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.349 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:35.349 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:35.349 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.349 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.349 "name": "BaseBdev2", 00:12:35.349 "aliases": [ 00:12:35.349 "9ef264eb-ed27-40e2-8da3-591f432b96a8" 00:12:35.349 ], 00:12:35.349 "product_name": "Malloc disk", 00:12:35.349 "block_size": 512, 00:12:35.349 "num_blocks": 65536, 00:12:35.349 "uuid": "9ef264eb-ed27-40e2-8da3-591f432b96a8", 00:12:35.349 "assigned_rate_limits": { 00:12:35.349 "rw_ios_per_sec": 0, 00:12:35.349 "rw_mbytes_per_sec": 0, 00:12:35.349 "r_mbytes_per_sec": 0, 00:12:35.349 "w_mbytes_per_sec": 0 00:12:35.349 }, 00:12:35.349 "claimed": true, 00:12:35.349 "claim_type": "exclusive_write", 00:12:35.349 "zoned": false, 00:12:35.349 "supported_io_types": { 00:12:35.349 "read": true, 00:12:35.349 "write": true, 00:12:35.349 "unmap": true, 00:12:35.349 "flush": true, 00:12:35.349 "reset": 
true, 00:12:35.349 "nvme_admin": false, 00:12:35.349 "nvme_io": false, 00:12:35.349 "nvme_io_md": false, 00:12:35.349 "write_zeroes": true, 00:12:35.349 "zcopy": true, 00:12:35.349 "get_zone_info": false, 00:12:35.349 "zone_management": false, 00:12:35.349 "zone_append": false, 00:12:35.349 "compare": false, 00:12:35.349 "compare_and_write": false, 00:12:35.349 "abort": true, 00:12:35.349 "seek_hole": false, 00:12:35.349 "seek_data": false, 00:12:35.349 "copy": true, 00:12:35.349 "nvme_iov_md": false 00:12:35.349 }, 00:12:35.349 "memory_domains": [ 00:12:35.349 { 00:12:35.349 "dma_device_id": "system", 00:12:35.349 "dma_device_type": 1 00:12:35.349 }, 00:12:35.349 { 00:12:35.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.349 "dma_device_type": 2 00:12:35.349 } 00:12:35.349 ], 00:12:35.349 "driver_specific": {} 00:12:35.349 }' 00:12:35.349 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.607 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.863 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.863 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.863 10:39:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:36.120 [2024-07-12 10:39:11.076353] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:36.120 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:36.120 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:36.120 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:36.120 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:36.120 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:36.120 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.121 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.378 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.378 "name": "Existed_Raid", 00:12:36.378 "uuid": "ae140f00-34cc-4cb1-90f1-29d392df6e84", 00:12:36.378 "strip_size_kb": 0, 00:12:36.378 "state": "online", 00:12:36.378 "raid_level": "raid1", 00:12:36.378 "superblock": true, 00:12:36.378 "num_base_bdevs": 2, 00:12:36.378 "num_base_bdevs_discovered": 1, 00:12:36.378 "num_base_bdevs_operational": 1, 00:12:36.378 "base_bdevs_list": [ 00:12:36.378 { 00:12:36.378 "name": null, 00:12:36.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.378 "is_configured": false, 00:12:36.378 "data_offset": 2048, 00:12:36.378 "data_size": 63488 00:12:36.378 }, 00:12:36.379 { 00:12:36.379 "name": "BaseBdev2", 00:12:36.379 "uuid": "9ef264eb-ed27-40e2-8da3-591f432b96a8", 00:12:36.379 "is_configured": true, 00:12:36.379 "data_offset": 2048, 00:12:36.379 "data_size": 63488 00:12:36.379 } 00:12:36.379 ] 00:12:36.379 }' 00:12:36.379 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.379 10:39:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:36.955 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:36.955 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:36.955 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.955 10:39:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:37.212 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:37.212 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:37.212 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:37.212 [2024-07-12 10:39:12.405795] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:37.212 [2024-07-12 10:39:12.405877] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:37.469 [2024-07-12 10:39:12.418515] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:37.469 [2024-07-12 10:39:12.418548] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:37.469 [2024-07-12 10:39:12.418559] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbe000 name Existed_Raid, state offline 00:12:37.469 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:37.469 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:37.470 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.470 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2029503 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2029503 ']' 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2029503 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2029503 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2029503' 00:12:37.727 killing process with pid 2029503 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2029503 00:12:37.727 [2024-07-12 10:39:12.734005] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:37.727 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2029503 00:12:37.727 [2024-07-12 10:39:12.734975] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:37.984 10:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:37.984 00:12:37.984 real 0m10.489s 00:12:37.984 user 0m18.650s 00:12:37.984 sys 0m1.934s 00:12:37.984 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:37.984 10:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:37.984 ************************************ 00:12:37.984 END TEST raid_state_function_test_sb 00:12:37.984 ************************************ 00:12:37.984 10:39:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:37.984 10:39:13 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:37.984 10:39:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:37.984 10:39:13 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:12:37.984 10:39:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:37.984 ************************************ 00:12:37.984 START TEST raid_superblock_test 00:12:37.984 ************************************ 00:12:37.984 10:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:12:37.984 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:12:37.984 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:37.984 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:37.984 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:37.984 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:37.984 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2031407 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2031407 /var/tmp/spdk-raid.sock 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2031407 ']' 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:37.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:37.985 10:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.985 [2024-07-12 10:39:13.105000] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
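For orientation, the raid_superblock_test trace that follows reduces to a short RPC sequence driven against the bdev_svc app started above. The sketch below is reconstructed from the xtrace output further down: the socket path and rpc.py arguments are copied from the trace, the $rpc shorthand variable is illustrative only, and the state verification, negative-create and cleanup steps of the real bdev_raid.sh are omitted.
# Condensed sketch of the RPC flow raid_superblock_test exercises (reconstructed from the xtrace below);
# $rpc is an illustrative shorthand for the rpc.py invocation seen in the trace.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Two malloc bdevs, each wrapped in a passthru bdev, become the RAID1 members.
$rpc bdev_malloc_create 32 512 -b malloc1
$rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc bdev_malloc_create 32 512 -b malloc2
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
# Assemble the RAID1 bdev with an on-disk superblock (-s).
$rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
# Inspect the reported state the same way the verify_raid_bdev_state helper does.
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'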
00:12:37.985 [2024-07-12 10:39:13.105065] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2031407 ] 00:12:38.242 [2024-07-12 10:39:13.232152] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:38.242 [2024-07-12 10:39:13.339450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:38.242 [2024-07-12 10:39:13.403587] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:38.242 [2024-07-12 10:39:13.403628] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:39.178 malloc1 00:12:39.178 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:39.437 [2024-07-12 10:39:14.517029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:39.437 [2024-07-12 10:39:14.517075] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:39.437 [2024-07-12 10:39:14.517096] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc85570 00:12:39.437 [2024-07-12 10:39:14.517109] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:39.437 [2024-07-12 10:39:14.518872] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:39.437 [2024-07-12 10:39:14.518901] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:39.437 pt1 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:39.437 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:39.695 malloc2 00:12:39.695 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:39.953 [2024-07-12 10:39:14.926858] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:39.953 [2024-07-12 10:39:14.926903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:39.953 [2024-07-12 10:39:14.926920] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc86970 00:12:39.953 [2024-07-12 10:39:14.926933] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:39.953 [2024-07-12 10:39:14.928518] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:39.953 [2024-07-12 10:39:14.928545] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:39.953 pt2 00:12:39.953 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:39.953 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:39.953 10:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:40.213 [2024-07-12 10:39:15.155478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:40.213 [2024-07-12 10:39:15.156790] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:40.213 [2024-07-12 10:39:15.156938] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe29270 00:12:40.213 [2024-07-12 10:39:15.156951] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:40.213 [2024-07-12 10:39:15.157152] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7d0e0 00:12:40.213 [2024-07-12 10:39:15.157296] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe29270 00:12:40.213 [2024-07-12 10:39:15.157306] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe29270 00:12:40.213 [2024-07-12 10:39:15.157402] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:40.213 10:39:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.213 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:40.472 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.472 "name": "raid_bdev1", 00:12:40.472 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:40.472 "strip_size_kb": 0, 00:12:40.472 "state": "online", 00:12:40.472 "raid_level": "raid1", 00:12:40.472 "superblock": true, 00:12:40.472 "num_base_bdevs": 2, 00:12:40.472 "num_base_bdevs_discovered": 2, 00:12:40.472 "num_base_bdevs_operational": 2, 00:12:40.472 "base_bdevs_list": [ 00:12:40.472 { 00:12:40.472 "name": "pt1", 00:12:40.472 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:40.472 "is_configured": true, 00:12:40.472 "data_offset": 2048, 00:12:40.472 "data_size": 63488 00:12:40.472 }, 00:12:40.472 { 00:12:40.472 "name": "pt2", 00:12:40.472 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:40.472 "is_configured": true, 00:12:40.472 "data_offset": 2048, 00:12:40.472 "data_size": 63488 00:12:40.472 } 00:12:40.472 ] 00:12:40.472 }' 00:12:40.472 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.472 10:39:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.039 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:41.039 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:41.039 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:41.039 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:41.039 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:41.039 10:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:41.039 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:41.039 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:41.039 [2024-07-12 10:39:16.226541] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:41.299 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:41.299 "name": "raid_bdev1", 00:12:41.299 "aliases": [ 00:12:41.299 "14a234e9-f92e-4c2b-8d94-26bf44aab4c2" 00:12:41.299 ], 00:12:41.299 "product_name": "Raid Volume", 00:12:41.299 "block_size": 512, 00:12:41.299 "num_blocks": 63488, 00:12:41.299 "uuid": 
"14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:41.299 "assigned_rate_limits": { 00:12:41.299 "rw_ios_per_sec": 0, 00:12:41.299 "rw_mbytes_per_sec": 0, 00:12:41.299 "r_mbytes_per_sec": 0, 00:12:41.299 "w_mbytes_per_sec": 0 00:12:41.299 }, 00:12:41.299 "claimed": false, 00:12:41.299 "zoned": false, 00:12:41.299 "supported_io_types": { 00:12:41.299 "read": true, 00:12:41.299 "write": true, 00:12:41.299 "unmap": false, 00:12:41.299 "flush": false, 00:12:41.299 "reset": true, 00:12:41.299 "nvme_admin": false, 00:12:41.299 "nvme_io": false, 00:12:41.299 "nvme_io_md": false, 00:12:41.299 "write_zeroes": true, 00:12:41.299 "zcopy": false, 00:12:41.299 "get_zone_info": false, 00:12:41.299 "zone_management": false, 00:12:41.299 "zone_append": false, 00:12:41.299 "compare": false, 00:12:41.299 "compare_and_write": false, 00:12:41.299 "abort": false, 00:12:41.299 "seek_hole": false, 00:12:41.299 "seek_data": false, 00:12:41.299 "copy": false, 00:12:41.299 "nvme_iov_md": false 00:12:41.299 }, 00:12:41.299 "memory_domains": [ 00:12:41.299 { 00:12:41.299 "dma_device_id": "system", 00:12:41.299 "dma_device_type": 1 00:12:41.299 }, 00:12:41.299 { 00:12:41.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.299 "dma_device_type": 2 00:12:41.299 }, 00:12:41.299 { 00:12:41.299 "dma_device_id": "system", 00:12:41.299 "dma_device_type": 1 00:12:41.299 }, 00:12:41.299 { 00:12:41.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.299 "dma_device_type": 2 00:12:41.299 } 00:12:41.299 ], 00:12:41.299 "driver_specific": { 00:12:41.299 "raid": { 00:12:41.299 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:41.299 "strip_size_kb": 0, 00:12:41.299 "state": "online", 00:12:41.299 "raid_level": "raid1", 00:12:41.299 "superblock": true, 00:12:41.299 "num_base_bdevs": 2, 00:12:41.299 "num_base_bdevs_discovered": 2, 00:12:41.299 "num_base_bdevs_operational": 2, 00:12:41.299 "base_bdevs_list": [ 00:12:41.299 { 00:12:41.299 "name": "pt1", 00:12:41.299 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:41.299 "is_configured": true, 00:12:41.299 "data_offset": 2048, 00:12:41.299 "data_size": 63488 00:12:41.299 }, 00:12:41.299 { 00:12:41.299 "name": "pt2", 00:12:41.299 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:41.299 "is_configured": true, 00:12:41.299 "data_offset": 2048, 00:12:41.299 "data_size": 63488 00:12:41.299 } 00:12:41.299 ] 00:12:41.299 } 00:12:41.299 } 00:12:41.299 }' 00:12:41.299 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:41.299 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:41.299 pt2' 00:12:41.299 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:41.299 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:41.299 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:41.588 "name": "pt1", 00:12:41.588 "aliases": [ 00:12:41.588 "00000000-0000-0000-0000-000000000001" 00:12:41.588 ], 00:12:41.588 "product_name": "passthru", 00:12:41.588 "block_size": 512, 00:12:41.588 "num_blocks": 65536, 00:12:41.588 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:41.588 "assigned_rate_limits": { 00:12:41.588 
"rw_ios_per_sec": 0, 00:12:41.588 "rw_mbytes_per_sec": 0, 00:12:41.588 "r_mbytes_per_sec": 0, 00:12:41.588 "w_mbytes_per_sec": 0 00:12:41.588 }, 00:12:41.588 "claimed": true, 00:12:41.588 "claim_type": "exclusive_write", 00:12:41.588 "zoned": false, 00:12:41.588 "supported_io_types": { 00:12:41.588 "read": true, 00:12:41.588 "write": true, 00:12:41.588 "unmap": true, 00:12:41.588 "flush": true, 00:12:41.588 "reset": true, 00:12:41.588 "nvme_admin": false, 00:12:41.588 "nvme_io": false, 00:12:41.588 "nvme_io_md": false, 00:12:41.588 "write_zeroes": true, 00:12:41.588 "zcopy": true, 00:12:41.588 "get_zone_info": false, 00:12:41.588 "zone_management": false, 00:12:41.588 "zone_append": false, 00:12:41.588 "compare": false, 00:12:41.588 "compare_and_write": false, 00:12:41.588 "abort": true, 00:12:41.588 "seek_hole": false, 00:12:41.588 "seek_data": false, 00:12:41.588 "copy": true, 00:12:41.588 "nvme_iov_md": false 00:12:41.588 }, 00:12:41.588 "memory_domains": [ 00:12:41.588 { 00:12:41.588 "dma_device_id": "system", 00:12:41.588 "dma_device_type": 1 00:12:41.588 }, 00:12:41.588 { 00:12:41.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.588 "dma_device_type": 2 00:12:41.588 } 00:12:41.588 ], 00:12:41.588 "driver_specific": { 00:12:41.588 "passthru": { 00:12:41.588 "name": "pt1", 00:12:41.588 "base_bdev_name": "malloc1" 00:12:41.588 } 00:12:41.588 } 00:12:41.588 }' 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.588 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.864 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.864 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.864 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.864 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.864 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:41.864 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:41.864 10:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:42.122 "name": "pt2", 00:12:42.122 "aliases": [ 00:12:42.122 "00000000-0000-0000-0000-000000000002" 00:12:42.122 ], 00:12:42.122 "product_name": "passthru", 00:12:42.122 "block_size": 512, 00:12:42.122 "num_blocks": 65536, 00:12:42.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.122 "assigned_rate_limits": { 00:12:42.122 "rw_ios_per_sec": 0, 00:12:42.122 "rw_mbytes_per_sec": 0, 00:12:42.122 "r_mbytes_per_sec": 0, 00:12:42.122 "w_mbytes_per_sec": 0 
00:12:42.122 }, 00:12:42.122 "claimed": true, 00:12:42.122 "claim_type": "exclusive_write", 00:12:42.122 "zoned": false, 00:12:42.122 "supported_io_types": { 00:12:42.122 "read": true, 00:12:42.122 "write": true, 00:12:42.122 "unmap": true, 00:12:42.122 "flush": true, 00:12:42.122 "reset": true, 00:12:42.122 "nvme_admin": false, 00:12:42.122 "nvme_io": false, 00:12:42.122 "nvme_io_md": false, 00:12:42.122 "write_zeroes": true, 00:12:42.122 "zcopy": true, 00:12:42.122 "get_zone_info": false, 00:12:42.122 "zone_management": false, 00:12:42.122 "zone_append": false, 00:12:42.122 "compare": false, 00:12:42.122 "compare_and_write": false, 00:12:42.122 "abort": true, 00:12:42.122 "seek_hole": false, 00:12:42.122 "seek_data": false, 00:12:42.122 "copy": true, 00:12:42.122 "nvme_iov_md": false 00:12:42.122 }, 00:12:42.122 "memory_domains": [ 00:12:42.122 { 00:12:42.122 "dma_device_id": "system", 00:12:42.122 "dma_device_type": 1 00:12:42.122 }, 00:12:42.122 { 00:12:42.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.122 "dma_device_type": 2 00:12:42.122 } 00:12:42.122 ], 00:12:42.122 "driver_specific": { 00:12:42.122 "passthru": { 00:12:42.122 "name": "pt2", 00:12:42.122 "base_bdev_name": "malloc2" 00:12:42.122 } 00:12:42.122 } 00:12:42.122 }' 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:42.122 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.380 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.380 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.380 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.380 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.380 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.380 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:42.380 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:42.637 [2024-07-12 10:39:17.718469] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:42.637 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=14a234e9-f92e-4c2b-8d94-26bf44aab4c2 00:12:42.637 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 14a234e9-f92e-4c2b-8d94-26bf44aab4c2 ']' 00:12:42.637 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:42.896 [2024-07-12 10:39:17.962880] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:42.896 [2024-07-12 10:39:17.962899] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:42.896 [2024-07-12 10:39:17.962949] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:42.896 [2024-07-12 10:39:17.963005] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:42.896 [2024-07-12 10:39:17.963016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe29270 name raid_bdev1, state offline 00:12:42.896 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.896 10:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:43.154 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:43.154 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:43.154 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:43.154 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:43.412 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:43.412 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:43.670 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:43.670 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:43.929 10:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:44.187 [2024-07-12 10:39:19.194075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:44.187 [2024-07-12 10:39:19.195473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:44.187 [2024-07-12 10:39:19.195543] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:44.187 [2024-07-12 10:39:19.195582] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:44.187 [2024-07-12 10:39:19.195601] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:44.187 [2024-07-12 10:39:19.195610] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe28ff0 name raid_bdev1, state configuring 00:12:44.187 request: 00:12:44.187 { 00:12:44.187 "name": "raid_bdev1", 00:12:44.187 "raid_level": "raid1", 00:12:44.187 "base_bdevs": [ 00:12:44.187 "malloc1", 00:12:44.187 "malloc2" 00:12:44.187 ], 00:12:44.187 "superblock": false, 00:12:44.187 "method": "bdev_raid_create", 00:12:44.187 "req_id": 1 00:12:44.187 } 00:12:44.187 Got JSON-RPC error response 00:12:44.187 response: 00:12:44.187 { 00:12:44.187 "code": -17, 00:12:44.187 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:44.187 } 00:12:44.187 10:39:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:44.187 10:39:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:44.187 10:39:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:44.187 10:39:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:44.187 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.187 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:44.445 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:44.445 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:44.445 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:44.703 [2024-07-12 10:39:19.695336] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:44.703 [2024-07-12 10:39:19.695379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:44.703 [2024-07-12 10:39:19.695400] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc857a0 00:12:44.703 [2024-07-12 10:39:19.695413] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:44.703 [2024-07-12 10:39:19.696996] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:12:44.703 [2024-07-12 10:39:19.697024] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:44.703 [2024-07-12 10:39:19.697086] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:44.703 [2024-07-12 10:39:19.697109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:44.703 pt1 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.703 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:44.962 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.962 "name": "raid_bdev1", 00:12:44.962 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:44.962 "strip_size_kb": 0, 00:12:44.962 "state": "configuring", 00:12:44.962 "raid_level": "raid1", 00:12:44.962 "superblock": true, 00:12:44.962 "num_base_bdevs": 2, 00:12:44.962 "num_base_bdevs_discovered": 1, 00:12:44.962 "num_base_bdevs_operational": 2, 00:12:44.962 "base_bdevs_list": [ 00:12:44.962 { 00:12:44.962 "name": "pt1", 00:12:44.962 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:44.962 "is_configured": true, 00:12:44.962 "data_offset": 2048, 00:12:44.962 "data_size": 63488 00:12:44.962 }, 00:12:44.962 { 00:12:44.962 "name": null, 00:12:44.962 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:44.962 "is_configured": false, 00:12:44.962 "data_offset": 2048, 00:12:44.962 "data_size": 63488 00:12:44.962 } 00:12:44.962 ] 00:12:44.962 }' 00:12:44.962 10:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.962 10:39:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:45.529 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:45.529 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:45.529 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:45.529 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:12:45.787 [2024-07-12 10:39:20.766185] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:45.787 [2024-07-12 10:39:20.766238] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:45.787 [2024-07-12 10:39:20.766257] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe1d6f0 00:12:45.787 [2024-07-12 10:39:20.766270] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:45.787 [2024-07-12 10:39:20.766616] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:45.787 [2024-07-12 10:39:20.766634] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:45.787 [2024-07-12 10:39:20.766696] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:45.787 [2024-07-12 10:39:20.766715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:45.787 [2024-07-12 10:39:20.766812] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe1e590 00:12:45.787 [2024-07-12 10:39:20.766823] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:45.787 [2024-07-12 10:39:20.766992] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7f540 00:12:45.787 [2024-07-12 10:39:20.767115] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe1e590 00:12:45.787 [2024-07-12 10:39:20.767125] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe1e590 00:12:45.787 [2024-07-12 10:39:20.767222] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:45.787 pt2 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.787 10:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.045 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.045 "name": "raid_bdev1", 00:12:46.045 "uuid": 
"14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:46.045 "strip_size_kb": 0, 00:12:46.045 "state": "online", 00:12:46.045 "raid_level": "raid1", 00:12:46.045 "superblock": true, 00:12:46.045 "num_base_bdevs": 2, 00:12:46.045 "num_base_bdevs_discovered": 2, 00:12:46.045 "num_base_bdevs_operational": 2, 00:12:46.045 "base_bdevs_list": [ 00:12:46.045 { 00:12:46.045 "name": "pt1", 00:12:46.045 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:46.045 "is_configured": true, 00:12:46.045 "data_offset": 2048, 00:12:46.045 "data_size": 63488 00:12:46.045 }, 00:12:46.045 { 00:12:46.045 "name": "pt2", 00:12:46.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:46.045 "is_configured": true, 00:12:46.045 "data_offset": 2048, 00:12:46.045 "data_size": 63488 00:12:46.045 } 00:12:46.045 ] 00:12:46.045 }' 00:12:46.045 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.045 10:39:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:46.611 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:46.869 [2024-07-12 10:39:21.905435] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:46.869 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:46.869 "name": "raid_bdev1", 00:12:46.869 "aliases": [ 00:12:46.869 "14a234e9-f92e-4c2b-8d94-26bf44aab4c2" 00:12:46.869 ], 00:12:46.869 "product_name": "Raid Volume", 00:12:46.869 "block_size": 512, 00:12:46.869 "num_blocks": 63488, 00:12:46.869 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:46.869 "assigned_rate_limits": { 00:12:46.869 "rw_ios_per_sec": 0, 00:12:46.869 "rw_mbytes_per_sec": 0, 00:12:46.869 "r_mbytes_per_sec": 0, 00:12:46.869 "w_mbytes_per_sec": 0 00:12:46.869 }, 00:12:46.869 "claimed": false, 00:12:46.869 "zoned": false, 00:12:46.869 "supported_io_types": { 00:12:46.869 "read": true, 00:12:46.869 "write": true, 00:12:46.869 "unmap": false, 00:12:46.869 "flush": false, 00:12:46.869 "reset": true, 00:12:46.869 "nvme_admin": false, 00:12:46.869 "nvme_io": false, 00:12:46.869 "nvme_io_md": false, 00:12:46.869 "write_zeroes": true, 00:12:46.869 "zcopy": false, 00:12:46.869 "get_zone_info": false, 00:12:46.869 "zone_management": false, 00:12:46.869 "zone_append": false, 00:12:46.870 "compare": false, 00:12:46.870 "compare_and_write": false, 00:12:46.870 "abort": false, 00:12:46.870 "seek_hole": false, 00:12:46.870 "seek_data": false, 00:12:46.870 "copy": false, 00:12:46.870 "nvme_iov_md": false 00:12:46.870 }, 00:12:46.870 "memory_domains": [ 00:12:46.870 { 00:12:46.870 "dma_device_id": "system", 00:12:46.870 "dma_device_type": 1 00:12:46.870 }, 
00:12:46.870 { 00:12:46.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.870 "dma_device_type": 2 00:12:46.870 }, 00:12:46.870 { 00:12:46.870 "dma_device_id": "system", 00:12:46.870 "dma_device_type": 1 00:12:46.870 }, 00:12:46.870 { 00:12:46.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.870 "dma_device_type": 2 00:12:46.870 } 00:12:46.870 ], 00:12:46.870 "driver_specific": { 00:12:46.870 "raid": { 00:12:46.870 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:46.870 "strip_size_kb": 0, 00:12:46.870 "state": "online", 00:12:46.870 "raid_level": "raid1", 00:12:46.870 "superblock": true, 00:12:46.870 "num_base_bdevs": 2, 00:12:46.870 "num_base_bdevs_discovered": 2, 00:12:46.870 "num_base_bdevs_operational": 2, 00:12:46.870 "base_bdevs_list": [ 00:12:46.870 { 00:12:46.870 "name": "pt1", 00:12:46.870 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:46.870 "is_configured": true, 00:12:46.870 "data_offset": 2048, 00:12:46.870 "data_size": 63488 00:12:46.870 }, 00:12:46.870 { 00:12:46.870 "name": "pt2", 00:12:46.870 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:46.870 "is_configured": true, 00:12:46.870 "data_offset": 2048, 00:12:46.870 "data_size": 63488 00:12:46.870 } 00:12:46.870 ] 00:12:46.870 } 00:12:46.870 } 00:12:46.870 }' 00:12:46.870 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:46.870 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:46.870 pt2' 00:12:46.870 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:46.870 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:46.870 10:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.127 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.127 "name": "pt1", 00:12:47.127 "aliases": [ 00:12:47.127 "00000000-0000-0000-0000-000000000001" 00:12:47.127 ], 00:12:47.127 "product_name": "passthru", 00:12:47.127 "block_size": 512, 00:12:47.127 "num_blocks": 65536, 00:12:47.127 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:47.127 "assigned_rate_limits": { 00:12:47.127 "rw_ios_per_sec": 0, 00:12:47.127 "rw_mbytes_per_sec": 0, 00:12:47.128 "r_mbytes_per_sec": 0, 00:12:47.128 "w_mbytes_per_sec": 0 00:12:47.128 }, 00:12:47.128 "claimed": true, 00:12:47.128 "claim_type": "exclusive_write", 00:12:47.128 "zoned": false, 00:12:47.128 "supported_io_types": { 00:12:47.128 "read": true, 00:12:47.128 "write": true, 00:12:47.128 "unmap": true, 00:12:47.128 "flush": true, 00:12:47.128 "reset": true, 00:12:47.128 "nvme_admin": false, 00:12:47.128 "nvme_io": false, 00:12:47.128 "nvme_io_md": false, 00:12:47.128 "write_zeroes": true, 00:12:47.128 "zcopy": true, 00:12:47.128 "get_zone_info": false, 00:12:47.128 "zone_management": false, 00:12:47.128 "zone_append": false, 00:12:47.128 "compare": false, 00:12:47.128 "compare_and_write": false, 00:12:47.128 "abort": true, 00:12:47.128 "seek_hole": false, 00:12:47.128 "seek_data": false, 00:12:47.128 "copy": true, 00:12:47.128 "nvme_iov_md": false 00:12:47.128 }, 00:12:47.128 "memory_domains": [ 00:12:47.128 { 00:12:47.128 "dma_device_id": "system", 00:12:47.128 "dma_device_type": 1 00:12:47.128 }, 00:12:47.128 { 00:12:47.128 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:47.128 "dma_device_type": 2 00:12:47.128 } 00:12:47.128 ], 00:12:47.128 "driver_specific": { 00:12:47.128 "passthru": { 00:12:47.128 "name": "pt1", 00:12:47.128 "base_bdev_name": "malloc1" 00:12:47.128 } 00:12:47.128 } 00:12:47.128 }' 00:12:47.128 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.128 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.128 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:47.128 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:47.386 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.644 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.644 "name": "pt2", 00:12:47.644 "aliases": [ 00:12:47.644 "00000000-0000-0000-0000-000000000002" 00:12:47.644 ], 00:12:47.644 "product_name": "passthru", 00:12:47.644 "block_size": 512, 00:12:47.644 "num_blocks": 65536, 00:12:47.644 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:47.644 "assigned_rate_limits": { 00:12:47.644 "rw_ios_per_sec": 0, 00:12:47.644 "rw_mbytes_per_sec": 0, 00:12:47.644 "r_mbytes_per_sec": 0, 00:12:47.644 "w_mbytes_per_sec": 0 00:12:47.644 }, 00:12:47.644 "claimed": true, 00:12:47.644 "claim_type": "exclusive_write", 00:12:47.644 "zoned": false, 00:12:47.644 "supported_io_types": { 00:12:47.644 "read": true, 00:12:47.644 "write": true, 00:12:47.644 "unmap": true, 00:12:47.644 "flush": true, 00:12:47.644 "reset": true, 00:12:47.644 "nvme_admin": false, 00:12:47.644 "nvme_io": false, 00:12:47.644 "nvme_io_md": false, 00:12:47.644 "write_zeroes": true, 00:12:47.644 "zcopy": true, 00:12:47.644 "get_zone_info": false, 00:12:47.644 "zone_management": false, 00:12:47.644 "zone_append": false, 00:12:47.644 "compare": false, 00:12:47.644 "compare_and_write": false, 00:12:47.644 "abort": true, 00:12:47.644 "seek_hole": false, 00:12:47.644 "seek_data": false, 00:12:47.644 "copy": true, 00:12:47.644 "nvme_iov_md": false 00:12:47.644 }, 00:12:47.644 "memory_domains": [ 00:12:47.644 { 00:12:47.644 "dma_device_id": "system", 00:12:47.644 "dma_device_type": 1 00:12:47.644 }, 00:12:47.644 { 00:12:47.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.644 "dma_device_type": 2 00:12:47.644 } 00:12:47.644 ], 00:12:47.644 "driver_specific": { 
00:12:47.644 "passthru": { 00:12:47.644 "name": "pt2", 00:12:47.644 "base_bdev_name": "malloc2" 00:12:47.644 } 00:12:47.644 } 00:12:47.644 }' 00:12:47.644 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.902 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.902 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:47.902 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.902 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.902 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.902 10:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.902 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.902 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.902 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.158 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.158 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.158 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:48.158 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:48.415 [2024-07-12 10:39:23.381352] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:48.415 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 14a234e9-f92e-4c2b-8d94-26bf44aab4c2 '!=' 14a234e9-f92e-4c2b-8d94-26bf44aab4c2 ']' 00:12:48.415 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:48.415 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:48.415 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:48.415 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:48.673 [2024-07-12 10:39:23.629793] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.673 10:39:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.673 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:48.931 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.931 "name": "raid_bdev1", 00:12:48.931 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:48.931 "strip_size_kb": 0, 00:12:48.931 "state": "online", 00:12:48.931 "raid_level": "raid1", 00:12:48.931 "superblock": true, 00:12:48.931 "num_base_bdevs": 2, 00:12:48.931 "num_base_bdevs_discovered": 1, 00:12:48.931 "num_base_bdevs_operational": 1, 00:12:48.931 "base_bdevs_list": [ 00:12:48.931 { 00:12:48.931 "name": null, 00:12:48.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.931 "is_configured": false, 00:12:48.931 "data_offset": 2048, 00:12:48.931 "data_size": 63488 00:12:48.931 }, 00:12:48.931 { 00:12:48.931 "name": "pt2", 00:12:48.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:48.931 "is_configured": true, 00:12:48.931 "data_offset": 2048, 00:12:48.931 "data_size": 63488 00:12:48.931 } 00:12:48.931 ] 00:12:48.931 }' 00:12:48.931 10:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.931 10:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.497 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:49.755 [2024-07-12 10:39:24.700611] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:49.755 [2024-07-12 10:39:24.700637] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:49.755 [2024-07-12 10:39:24.700688] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:49.755 [2024-07-12 10:39:24.700727] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:49.755 [2024-07-12 10:39:24.700739] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1e590 name raid_bdev1, state offline 00:12:49.755 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.755 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:50.014 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:50.014 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:50.014 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:50.014 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:50.014 10:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:50.014 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:50.014 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:50.014 10:39:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:50.014 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:50.014 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:50.014 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:50.273 [2024-07-12 10:39:25.430515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:50.273 [2024-07-12 10:39:25.430560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:50.273 [2024-07-12 10:39:25.430578] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc86160 00:12:50.273 [2024-07-12 10:39:25.430590] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:50.273 [2024-07-12 10:39:25.432174] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:50.273 [2024-07-12 10:39:25.432203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:50.273 [2024-07-12 10:39:25.432266] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:50.273 [2024-07-12 10:39:25.432291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:50.273 [2024-07-12 10:39:25.432371] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc7c380 00:12:50.273 [2024-07-12 10:39:25.432382] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:50.273 [2024-07-12 10:39:25.432563] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7da80 00:12:50.273 [2024-07-12 10:39:25.432685] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc7c380 00:12:50.273 [2024-07-12 10:39:25.432695] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc7c380 00:12:50.273 [2024-07-12 10:39:25.432790] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:50.273 pt2 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:50.273 10:39:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.531 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:50.531 "name": "raid_bdev1", 00:12:50.531 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:50.531 "strip_size_kb": 0, 00:12:50.531 "state": "online", 00:12:50.531 "raid_level": "raid1", 00:12:50.531 "superblock": true, 00:12:50.531 "num_base_bdevs": 2, 00:12:50.531 "num_base_bdevs_discovered": 1, 00:12:50.531 "num_base_bdevs_operational": 1, 00:12:50.531 "base_bdevs_list": [ 00:12:50.531 { 00:12:50.531 "name": null, 00:12:50.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.531 "is_configured": false, 00:12:50.531 "data_offset": 2048, 00:12:50.531 "data_size": 63488 00:12:50.531 }, 00:12:50.531 { 00:12:50.531 "name": "pt2", 00:12:50.531 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:50.531 "is_configured": true, 00:12:50.531 "data_offset": 2048, 00:12:50.531 "data_size": 63488 00:12:50.531 } 00:12:50.531 ] 00:12:50.531 }' 00:12:50.531 10:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:50.531 10:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:51.466 10:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:51.466 [2024-07-12 10:39:26.533453] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:51.466 [2024-07-12 10:39:26.533475] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:51.466 [2024-07-12 10:39:26.533530] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:51.466 [2024-07-12 10:39:26.533572] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:51.466 [2024-07-12 10:39:26.533583] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc7c380 name raid_bdev1, state offline 00:12:51.466 10:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.466 10:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:51.724 10:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:51.724 10:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:51.724 10:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:51.724 10:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:51.982 [2024-07-12 10:39:27.030764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:51.982 [2024-07-12 10:39:27.030804] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.982 [2024-07-12 10:39:27.030822] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe28520 00:12:51.982 [2024-07-12 10:39:27.030834] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.982 [2024-07-12 10:39:27.032411] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.982 [2024-07-12 10:39:27.032439] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:51.982 [2024-07-12 10:39:27.032512] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:51.982 [2024-07-12 10:39:27.032536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:51.982 [2024-07-12 10:39:27.032630] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:51.982 [2024-07-12 10:39:27.032643] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:51.982 [2024-07-12 10:39:27.032656] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc7d3f0 name raid_bdev1, state configuring 00:12:51.982 [2024-07-12 10:39:27.032678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:51.982 [2024-07-12 10:39:27.032735] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc7f2b0 00:12:51.982 [2024-07-12 10:39:27.032746] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:51.982 [2024-07-12 10:39:27.032907] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7c350 00:12:51.982 [2024-07-12 10:39:27.033027] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc7f2b0 00:12:51.982 [2024-07-12 10:39:27.033037] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc7f2b0 00:12:51.982 [2024-07-12 10:39:27.033133] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:51.982 pt1 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.982 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.983 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.983 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.983 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:52.242 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.242 "name": "raid_bdev1", 00:12:52.242 "uuid": "14a234e9-f92e-4c2b-8d94-26bf44aab4c2", 00:12:52.242 "strip_size_kb": 0, 00:12:52.242 "state": "online", 
00:12:52.242 "raid_level": "raid1", 00:12:52.242 "superblock": true, 00:12:52.242 "num_base_bdevs": 2, 00:12:52.242 "num_base_bdevs_discovered": 1, 00:12:52.242 "num_base_bdevs_operational": 1, 00:12:52.242 "base_bdevs_list": [ 00:12:52.242 { 00:12:52.242 "name": null, 00:12:52.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:52.242 "is_configured": false, 00:12:52.242 "data_offset": 2048, 00:12:52.242 "data_size": 63488 00:12:52.242 }, 00:12:52.242 { 00:12:52.242 "name": "pt2", 00:12:52.242 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:52.242 "is_configured": true, 00:12:52.242 "data_offset": 2048, 00:12:52.242 "data_size": 63488 00:12:52.242 } 00:12:52.242 ] 00:12:52.242 }' 00:12:52.242 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.242 10:39:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.808 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:52.808 10:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:53.066 10:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:53.066 10:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:53.066 10:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:53.323 [2024-07-12 10:39:28.366507] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 14a234e9-f92e-4c2b-8d94-26bf44aab4c2 '!=' 14a234e9-f92e-4c2b-8d94-26bf44aab4c2 ']' 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2031407 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2031407 ']' 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2031407 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2031407 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2031407' 00:12:53.323 killing process with pid 2031407 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2031407 00:12:53.323 [2024-07-12 10:39:28.427274] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:53.323 [2024-07-12 10:39:28.427325] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:53.323 [2024-07-12 10:39:28.427366] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:53.323 [2024-07-12 10:39:28.427377] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0xc7f2b0 name raid_bdev1, state offline 00:12:53.323 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2031407 00:12:53.323 [2024-07-12 10:39:28.446602] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:53.581 10:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:53.581 00:12:53.581 real 0m15.625s 00:12:53.581 user 0m28.412s 00:12:53.581 sys 0m2.835s 00:12:53.581 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:53.581 10:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.581 ************************************ 00:12:53.581 END TEST raid_superblock_test 00:12:53.581 ************************************ 00:12:53.581 10:39:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:53.581 10:39:28 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:53.581 10:39:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:53.581 10:39:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:53.581 10:39:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:53.581 ************************************ 00:12:53.581 START TEST raid_read_error_test 00:12:53.581 ************************************ 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:53.581 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.brz40CQCLm 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2033805 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2033805 /var/tmp/spdk-raid.sock 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2033805 ']' 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:53.582 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:53.582 10:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.840 [2024-07-12 10:39:28.826772] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:12:53.840 [2024-07-12 10:39:28.826826] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2033805 ] 00:12:53.840 [2024-07-12 10:39:28.937054] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:54.098 [2024-07-12 10:39:29.039242] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:54.098 [2024-07-12 10:39:29.100310] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:54.098 [2024-07-12 10:39:29.100347] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:54.664 10:39:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:54.664 10:39:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:54.664 10:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:54.664 10:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:54.921 BaseBdev1_malloc 00:12:54.921 10:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:55.249 true 00:12:55.249 10:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:55.541 [2024-07-12 10:39:30.501903] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:55.541 [2024-07-12 10:39:30.501954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:55.541 [2024-07-12 10:39:30.501975] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e50d0 00:12:55.541 [2024-07-12 10:39:30.501987] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:55.541 [2024-07-12 10:39:30.503922] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:55.541 [2024-07-12 10:39:30.503950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:55.541 BaseBdev1 00:12:55.541 10:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:55.541 10:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:55.799 BaseBdev2_malloc 00:12:55.799 10:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:56.057 true 00:12:56.057 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:56.057 [2024-07-12 10:39:31.237190] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:56.057 [2024-07-12 10:39:31.237240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:56.057 [2024-07-12 10:39:31.237262] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e9910 00:12:56.057 [2024-07-12 10:39:31.237274] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:56.057 [2024-07-12 10:39:31.238884] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:56.057 [2024-07-12 10:39:31.238914] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:56.057 BaseBdev2 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:56.315 [2024-07-12 10:39:31.477850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:56.315 [2024-07-12 10:39:31.479229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:56.315 [2024-07-12 10:39:31.479431] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18eb320 00:12:56.315 [2024-07-12 10:39:31.479445] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:56.315 [2024-07-12 10:39:31.479658] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1752d00 00:12:56.315 [2024-07-12 10:39:31.479816] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18eb320 00:12:56.315 [2024-07-12 10:39:31.479826] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18eb320 00:12:56.315 [2024-07-12 10:39:31.479938] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:56.315 10:39:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.315 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:56.574 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.574 "name": "raid_bdev1", 00:12:56.574 "uuid": "f0d38a10-bccd-4a6b-bebb-7768e56d3ef9", 00:12:56.574 "strip_size_kb": 0, 00:12:56.574 "state": "online", 00:12:56.574 "raid_level": "raid1", 00:12:56.574 "superblock": true, 00:12:56.574 "num_base_bdevs": 2, 00:12:56.574 "num_base_bdevs_discovered": 2, 00:12:56.574 "num_base_bdevs_operational": 2, 00:12:56.574 "base_bdevs_list": [ 00:12:56.574 { 00:12:56.574 "name": "BaseBdev1", 00:12:56.574 "uuid": "44e5034e-9372-537f-b8ca-d804abd45565", 00:12:56.574 "is_configured": true, 00:12:56.574 "data_offset": 2048, 00:12:56.574 "data_size": 63488 00:12:56.574 }, 00:12:56.574 { 00:12:56.574 "name": "BaseBdev2", 00:12:56.574 "uuid": "0b144f2e-1fb6-5aeb-a3f1-eddc9c435bb5", 00:12:56.574 "is_configured": true, 00:12:56.574 "data_offset": 2048, 00:12:56.574 "data_size": 63488 00:12:56.574 } 00:12:56.574 ] 00:12:56.574 }' 00:12:56.574 10:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.574 10:39:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:57.142 10:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:57.142 10:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:57.404 [2024-07-12 10:39:32.412736] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e6c70 00:12:58.339 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 
00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.598 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:58.856 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.856 "name": "raid_bdev1", 00:12:58.856 "uuid": "f0d38a10-bccd-4a6b-bebb-7768e56d3ef9", 00:12:58.856 "strip_size_kb": 0, 00:12:58.856 "state": "online", 00:12:58.856 "raid_level": "raid1", 00:12:58.856 "superblock": true, 00:12:58.856 "num_base_bdevs": 2, 00:12:58.856 "num_base_bdevs_discovered": 2, 00:12:58.856 "num_base_bdevs_operational": 2, 00:12:58.856 "base_bdevs_list": [ 00:12:58.856 { 00:12:58.856 "name": "BaseBdev1", 00:12:58.857 "uuid": "44e5034e-9372-537f-b8ca-d804abd45565", 00:12:58.857 "is_configured": true, 00:12:58.857 "data_offset": 2048, 00:12:58.857 "data_size": 63488 00:12:58.857 }, 00:12:58.857 { 00:12:58.857 "name": "BaseBdev2", 00:12:58.857 "uuid": "0b144f2e-1fb6-5aeb-a3f1-eddc9c435bb5", 00:12:58.857 "is_configured": true, 00:12:58.857 "data_offset": 2048, 00:12:58.857 "data_size": 63488 00:12:58.857 } 00:12:58.857 ] 00:12:58.857 }' 00:12:58.857 10:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.857 10:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.424 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:59.424 [2024-07-12 10:39:34.551310] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:59.424 [2024-07-12 10:39:34.551349] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:59.424 [2024-07-12 10:39:34.554576] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:59.424 [2024-07-12 10:39:34.554606] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:59.424 [2024-07-12 10:39:34.554687] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:59.424 
[2024-07-12 10:39:34.554698] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18eb320 name raid_bdev1, state offline 00:12:59.424 0 00:12:59.424 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2033805 00:12:59.424 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2033805 ']' 00:12:59.424 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2033805 00:12:59.424 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:59.424 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:59.424 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2033805 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2033805' 00:12:59.683 killing process with pid 2033805 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2033805 00:12:59.683 [2024-07-12 10:39:34.620652] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2033805 00:12:59.683 [2024-07-12 10:39:34.631323] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.brz40CQCLm 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:59.683 10:39:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:59.684 00:12:59.684 real 0m6.113s 00:12:59.684 user 0m9.477s 00:12:59.684 sys 0m1.109s 00:12:59.684 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:59.684 10:39:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.684 ************************************ 00:12:59.684 END TEST raid_read_error_test 00:12:59.684 ************************************ 00:12:59.942 10:39:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:59.942 10:39:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:12:59.942 10:39:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:59.942 10:39:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:59.942 10:39:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:59.942 ************************************ 00:12:59.942 START TEST raid_write_error_test 00:12:59.942 ************************************ 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.O0qT4gkDgV 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2034775 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2034775 /var/tmp/spdk-raid.sock 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2034775 ']' 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:59.942 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.942 10:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:59.942 [2024-07-12 10:39:35.018151] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:12:59.942 [2024-07-12 10:39:35.018215] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2034775 ] 00:13:00.201 [2024-07-12 10:39:35.145741] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:00.201 [2024-07-12 10:39:35.248061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:00.201 [2024-07-12 10:39:35.308858] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:00.201 [2024-07-12 10:39:35.308896] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:00.789 10:39:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:00.789 10:39:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:00.789 10:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:00.789 10:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:01.047 BaseBdev1_malloc 00:13:01.047 10:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:01.047 true 00:13:01.047 10:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:01.306 [2024-07-12 10:39:36.337005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:01.306 [2024-07-12 10:39:36.337051] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:01.306 [2024-07-12 10:39:36.337072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24150d0 00:13:01.306 [2024-07-12 10:39:36.337085] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:01.306 [2024-07-12 10:39:36.338937] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:01.306 [2024-07-12 10:39:36.338966] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:01.306 BaseBdev1 00:13:01.306 10:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:01.306 10:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:01.871 BaseBdev2_malloc 00:13:01.871 10:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:01.871 true 00:13:01.871 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:02.436 [2024-07-12 10:39:37.512638] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:02.436 [2024-07-12 10:39:37.512683] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:02.436 [2024-07-12 10:39:37.512704] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2419910 00:13:02.436 [2024-07-12 10:39:37.512717] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:02.436 [2024-07-12 10:39:37.514314] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:02.436 [2024-07-12 10:39:37.514342] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:02.436 BaseBdev2 00:13:02.437 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:02.694 [2024-07-12 10:39:37.693134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:02.694 [2024-07-12 10:39:37.694533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:02.694 [2024-07-12 10:39:37.694728] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x241b320 00:13:02.695 [2024-07-12 10:39:37.694742] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:02.695 [2024-07-12 10:39:37.694934] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2282d00 00:13:02.695 [2024-07-12 10:39:37.695083] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x241b320 00:13:02.695 [2024-07-12 10:39:37.695093] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x241b320 00:13:02.695 [2024-07-12 10:39:37.695202] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.695 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:02.953 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.953 "name": "raid_bdev1", 00:13:02.953 "uuid": "f5a003a6-4e1a-4e2d-a928-e09f9907bee1", 00:13:02.953 "strip_size_kb": 0, 00:13:02.953 "state": "online", 00:13:02.953 "raid_level": "raid1", 00:13:02.953 "superblock": true, 00:13:02.953 "num_base_bdevs": 2, 00:13:02.953 "num_base_bdevs_discovered": 2, 00:13:02.953 "num_base_bdevs_operational": 2, 00:13:02.953 "base_bdevs_list": [ 00:13:02.953 { 00:13:02.953 "name": "BaseBdev1", 00:13:02.953 "uuid": "158bc1d2-2209-5113-afdf-717ccec7ecd6", 00:13:02.953 "is_configured": true, 00:13:02.953 "data_offset": 2048, 00:13:02.953 "data_size": 63488 00:13:02.953 }, 00:13:02.953 { 00:13:02.953 "name": "BaseBdev2", 00:13:02.953 "uuid": "ce2cde1a-edc8-5be8-9d64-cbc78e08ad60", 00:13:02.953 "is_configured": true, 00:13:02.953 "data_offset": 2048, 00:13:02.953 "data_size": 63488 00:13:02.953 } 00:13:02.953 ] 00:13:02.953 }' 00:13:02.953 10:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.953 10:39:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.520 10:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:03.520 10:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:03.520 [2024-07-12 10:39:38.639910] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2416c70 00:13:04.455 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:04.714 [2024-07-12 10:39:39.771876] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:04.714 [2024-07-12 10:39:39.771938] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:04.714 [2024-07-12 10:39:39.772111] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2416c70 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:04.714 10:39:39 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.714 10:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:05.282 10:39:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.282 "name": "raid_bdev1", 00:13:05.282 "uuid": "f5a003a6-4e1a-4e2d-a928-e09f9907bee1", 00:13:05.282 "strip_size_kb": 0, 00:13:05.282 "state": "online", 00:13:05.282 "raid_level": "raid1", 00:13:05.282 "superblock": true, 00:13:05.282 "num_base_bdevs": 2, 00:13:05.282 "num_base_bdevs_discovered": 1, 00:13:05.282 "num_base_bdevs_operational": 1, 00:13:05.282 "base_bdevs_list": [ 00:13:05.282 { 00:13:05.282 "name": null, 00:13:05.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.282 "is_configured": false, 00:13:05.282 "data_offset": 2048, 00:13:05.282 "data_size": 63488 00:13:05.282 }, 00:13:05.282 { 00:13:05.282 "name": "BaseBdev2", 00:13:05.282 "uuid": "ce2cde1a-edc8-5be8-9d64-cbc78e08ad60", 00:13:05.282 "is_configured": true, 00:13:05.282 "data_offset": 2048, 00:13:05.282 "data_size": 63488 00:13:05.282 } 00:13:05.282 ] 00:13:05.282 }' 00:13:05.282 10:39:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.282 10:39:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.851 10:39:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:06.110 [2024-07-12 10:39:41.133969] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:06.110 [2024-07-12 10:39:41.134005] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:06.111 [2024-07-12 10:39:41.137126] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:06.111 [2024-07-12 10:39:41.137152] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:06.111 [2024-07-12 10:39:41.137205] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:06.111 [2024-07-12 10:39:41.137216] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x241b320 name raid_bdev1, state offline 00:13:06.111 0 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2034775 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2034775 ']' 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2034775 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2034775 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2034775' 00:13:06.111 killing process with pid 2034775 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2034775 00:13:06.111 [2024-07-12 10:39:41.200237] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:06.111 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2034775 00:13:06.111 [2024-07-12 10:39:41.210769] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.O0qT4gkDgV 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:06.370 00:13:06.370 real 0m6.496s 00:13:06.370 user 0m10.310s 00:13:06.370 sys 0m1.050s 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:06.370 10:39:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.370 ************************************ 00:13:06.370 END TEST raid_write_error_test 00:13:06.370 ************************************ 00:13:06.370 10:39:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:06.370 10:39:41 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:06.370 10:39:41 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:06.370 10:39:41 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:06.370 10:39:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:06.370 10:39:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:06.370 10:39:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:06.370 ************************************ 00:13:06.370 START TEST raid_state_function_test 00:13:06.370 ************************************ 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- 
# local raid_bdev 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2035696 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2035696' 00:13:06.370 Process raid pid: 2035696 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2035696 /var/tmp/spdk-raid.sock 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2035696 ']' 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:13:06.370 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:06.370 10:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.642 [2024-07-12 10:39:41.628371] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:13:06.642 [2024-07-12 10:39:41.628516] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:06.642 [2024-07-12 10:39:41.825765] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:06.901 [2024-07-12 10:39:41.928999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:06.901 [2024-07-12 10:39:41.993549] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:06.901 [2024-07-12 10:39:41.993582] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.468 10:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:07.468 10:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:07.468 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:07.726 [2024-07-12 10:39:42.736747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:07.726 [2024-07-12 10:39:42.736786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:07.726 [2024-07-12 10:39:42.736797] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:07.726 [2024-07-12 10:39:42.736809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:07.726 [2024-07-12 10:39:42.736818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:07.726 [2024-07-12 10:39:42.736833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.726 10:39:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.726 10:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:07.985 10:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.985 "name": "Existed_Raid", 00:13:07.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.985 "strip_size_kb": 64, 00:13:07.985 "state": "configuring", 00:13:07.985 "raid_level": "raid0", 00:13:07.985 "superblock": false, 00:13:07.985 "num_base_bdevs": 3, 00:13:07.985 "num_base_bdevs_discovered": 0, 00:13:07.985 "num_base_bdevs_operational": 3, 00:13:07.985 "base_bdevs_list": [ 00:13:07.985 { 00:13:07.985 "name": "BaseBdev1", 00:13:07.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.985 "is_configured": false, 00:13:07.985 "data_offset": 0, 00:13:07.985 "data_size": 0 00:13:07.985 }, 00:13:07.985 { 00:13:07.985 "name": "BaseBdev2", 00:13:07.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.985 "is_configured": false, 00:13:07.985 "data_offset": 0, 00:13:07.985 "data_size": 0 00:13:07.985 }, 00:13:07.985 { 00:13:07.985 "name": "BaseBdev3", 00:13:07.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.986 "is_configured": false, 00:13:07.986 "data_offset": 0, 00:13:07.986 "data_size": 0 00:13:07.986 } 00:13:07.986 ] 00:13:07.986 }' 00:13:07.986 10:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:07.986 10:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.552 10:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:08.552 [2024-07-12 10:39:43.735257] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:08.552 [2024-07-12 10:39:43.735288] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8e3a80 name Existed_Raid, state configuring 00:13:08.811 10:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:08.811 [2024-07-12 10:39:43.983928] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:08.811 [2024-07-12 10:39:43.983954] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:08.811 [2024-07-12 10:39:43.983963] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:08.811 [2024-07-12 10:39:43.983974] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:08.811 [2024-07-12 10:39:43.983983] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:08.811 [2024-07-12 10:39:43.983994] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:08.811 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
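The trace above exercises the raid0 state machine by hand: bdev_raid_create is issued while none of the base bdevs exist yet, so Existed_Raid sits in the "configuring" state with num_base_bdevs_discovered 0, and the base devices are then backfilled one at a time with bdev_malloc_create. Condensed into plain RPC calls, the same flow looks roughly like the sketch below (a sketch only — it assumes the bdev_svc app started earlier in the log is still listening on /var/tmp/spdk-raid.sock):

# Create the raid0 volume first; with no base bdevs registered it stays "configuring".
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

# Same query and jq filter that verify_raid_bdev_state uses to check state and
# the discovered/operational base bdev counts.
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# Backfill the first base bdev (32 MB malloc disk, 512-byte blocks); the raid claims it
# but remains "configuring" until all three base bdevs exist.
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_malloc_create 32 512 -b BaseBdev1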
00:13:09.070 [2024-07-12 10:39:44.238475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:09.070 BaseBdev1 00:13:09.070 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:09.070 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:09.070 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:09.070 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:09.070 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:09.070 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:09.070 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:09.328 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:09.587 [ 00:13:09.587 { 00:13:09.587 "name": "BaseBdev1", 00:13:09.587 "aliases": [ 00:13:09.587 "b8b7b642-ac50-4482-b93c-2439877e4cc8" 00:13:09.587 ], 00:13:09.587 "product_name": "Malloc disk", 00:13:09.587 "block_size": 512, 00:13:09.587 "num_blocks": 65536, 00:13:09.587 "uuid": "b8b7b642-ac50-4482-b93c-2439877e4cc8", 00:13:09.587 "assigned_rate_limits": { 00:13:09.587 "rw_ios_per_sec": 0, 00:13:09.587 "rw_mbytes_per_sec": 0, 00:13:09.587 "r_mbytes_per_sec": 0, 00:13:09.587 "w_mbytes_per_sec": 0 00:13:09.587 }, 00:13:09.587 "claimed": true, 00:13:09.587 "claim_type": "exclusive_write", 00:13:09.587 "zoned": false, 00:13:09.587 "supported_io_types": { 00:13:09.587 "read": true, 00:13:09.587 "write": true, 00:13:09.587 "unmap": true, 00:13:09.587 "flush": true, 00:13:09.587 "reset": true, 00:13:09.587 "nvme_admin": false, 00:13:09.587 "nvme_io": false, 00:13:09.587 "nvme_io_md": false, 00:13:09.587 "write_zeroes": true, 00:13:09.587 "zcopy": true, 00:13:09.587 "get_zone_info": false, 00:13:09.587 "zone_management": false, 00:13:09.587 "zone_append": false, 00:13:09.587 "compare": false, 00:13:09.587 "compare_and_write": false, 00:13:09.587 "abort": true, 00:13:09.587 "seek_hole": false, 00:13:09.587 "seek_data": false, 00:13:09.587 "copy": true, 00:13:09.587 "nvme_iov_md": false 00:13:09.587 }, 00:13:09.587 "memory_domains": [ 00:13:09.587 { 00:13:09.587 "dma_device_id": "system", 00:13:09.587 "dma_device_type": 1 00:13:09.587 }, 00:13:09.587 { 00:13:09.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.587 "dma_device_type": 2 00:13:09.587 } 00:13:09.587 ], 00:13:09.587 "driver_specific": {} 00:13:09.587 } 00:13:09.587 ] 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:09.587 10:39:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.587 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.857 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.857 "name": "Existed_Raid", 00:13:09.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.857 "strip_size_kb": 64, 00:13:09.857 "state": "configuring", 00:13:09.857 "raid_level": "raid0", 00:13:09.857 "superblock": false, 00:13:09.857 "num_base_bdevs": 3, 00:13:09.857 "num_base_bdevs_discovered": 1, 00:13:09.857 "num_base_bdevs_operational": 3, 00:13:09.857 "base_bdevs_list": [ 00:13:09.857 { 00:13:09.857 "name": "BaseBdev1", 00:13:09.857 "uuid": "b8b7b642-ac50-4482-b93c-2439877e4cc8", 00:13:09.857 "is_configured": true, 00:13:09.857 "data_offset": 0, 00:13:09.857 "data_size": 65536 00:13:09.857 }, 00:13:09.857 { 00:13:09.857 "name": "BaseBdev2", 00:13:09.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.857 "is_configured": false, 00:13:09.857 "data_offset": 0, 00:13:09.857 "data_size": 0 00:13:09.857 }, 00:13:09.857 { 00:13:09.857 "name": "BaseBdev3", 00:13:09.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.857 "is_configured": false, 00:13:09.857 "data_offset": 0, 00:13:09.857 "data_size": 0 00:13:09.857 } 00:13:09.857 ] 00:13:09.857 }' 00:13:09.857 10:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.857 10:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.431 10:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:10.689 [2024-07-12 10:39:45.834727] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:10.689 [2024-07-12 10:39:45.834772] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8e3310 name Existed_Raid, state configuring 00:13:10.689 10:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:10.948 [2024-07-12 10:39:46.083409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:10.948 [2024-07-12 10:39:46.084840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:10.948 [2024-07-12 10:39:46.084872] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:10.948 [2024-07-12 10:39:46.084882] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:10.948 [2024-07-12 10:39:46.084893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.948 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.206 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.206 "name": "Existed_Raid", 00:13:11.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.206 "strip_size_kb": 64, 00:13:11.206 "state": "configuring", 00:13:11.206 "raid_level": "raid0", 00:13:11.206 "superblock": false, 00:13:11.206 "num_base_bdevs": 3, 00:13:11.206 "num_base_bdevs_discovered": 1, 00:13:11.206 "num_base_bdevs_operational": 3, 00:13:11.206 "base_bdevs_list": [ 00:13:11.206 { 00:13:11.206 "name": "BaseBdev1", 00:13:11.206 "uuid": "b8b7b642-ac50-4482-b93c-2439877e4cc8", 00:13:11.206 "is_configured": true, 00:13:11.206 "data_offset": 0, 00:13:11.206 "data_size": 65536 00:13:11.206 }, 00:13:11.206 { 00:13:11.206 "name": "BaseBdev2", 00:13:11.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.206 "is_configured": false, 00:13:11.206 "data_offset": 0, 00:13:11.206 "data_size": 0 00:13:11.206 }, 00:13:11.206 { 00:13:11.206 "name": "BaseBdev3", 00:13:11.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.206 "is_configured": false, 00:13:11.206 "data_offset": 0, 00:13:11.206 "data_size": 0 00:13:11.206 } 00:13:11.206 ] 00:13:11.206 }' 00:13:11.206 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.206 10:39:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.774 10:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
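At this point Existed_Raid has been re-created with BaseBdev1 claimed and the other two base bdevs still missing; the test now creates BaseBdev2 and blocks in waitforbdev until the new malloc disk is registered. The wait reduces to the two RPCs visible in the trace that follows; run by hand against the same socket (timeout 2000 being the value the helper falls back to), a rough equivalent is:

# Create the second base bdev, then let any pending examine callbacks finish.
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_malloc_create 32 512 -b BaseBdev2
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_wait_for_examine

# Poll for the bdev (timeout 2000, as in waitforbdev) and dump its descriptor;
# once this succeeds the helper returns 0, as the "return 0" in the trace shows.
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_get_bdevs -b BaseBdev2 -t 2000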
00:13:12.033 [2024-07-12 10:39:47.154818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:12.033 BaseBdev2 00:13:12.033 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:12.033 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:12.033 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:12.033 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:12.033 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:12.033 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:12.033 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:12.291 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:12.551 [ 00:13:12.551 { 00:13:12.551 "name": "BaseBdev2", 00:13:12.551 "aliases": [ 00:13:12.551 "3052383f-c771-43d7-a64d-0b95a5705f69" 00:13:12.551 ], 00:13:12.551 "product_name": "Malloc disk", 00:13:12.551 "block_size": 512, 00:13:12.551 "num_blocks": 65536, 00:13:12.551 "uuid": "3052383f-c771-43d7-a64d-0b95a5705f69", 00:13:12.551 "assigned_rate_limits": { 00:13:12.551 "rw_ios_per_sec": 0, 00:13:12.551 "rw_mbytes_per_sec": 0, 00:13:12.551 "r_mbytes_per_sec": 0, 00:13:12.551 "w_mbytes_per_sec": 0 00:13:12.551 }, 00:13:12.551 "claimed": true, 00:13:12.551 "claim_type": "exclusive_write", 00:13:12.551 "zoned": false, 00:13:12.551 "supported_io_types": { 00:13:12.551 "read": true, 00:13:12.551 "write": true, 00:13:12.551 "unmap": true, 00:13:12.551 "flush": true, 00:13:12.551 "reset": true, 00:13:12.551 "nvme_admin": false, 00:13:12.551 "nvme_io": false, 00:13:12.551 "nvme_io_md": false, 00:13:12.551 "write_zeroes": true, 00:13:12.551 "zcopy": true, 00:13:12.551 "get_zone_info": false, 00:13:12.551 "zone_management": false, 00:13:12.551 "zone_append": false, 00:13:12.551 "compare": false, 00:13:12.551 "compare_and_write": false, 00:13:12.551 "abort": true, 00:13:12.551 "seek_hole": false, 00:13:12.551 "seek_data": false, 00:13:12.551 "copy": true, 00:13:12.551 "nvme_iov_md": false 00:13:12.551 }, 00:13:12.551 "memory_domains": [ 00:13:12.551 { 00:13:12.551 "dma_device_id": "system", 00:13:12.551 "dma_device_type": 1 00:13:12.551 }, 00:13:12.551 { 00:13:12.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.551 "dma_device_type": 2 00:13:12.551 } 00:13:12.551 ], 00:13:12.551 "driver_specific": {} 00:13:12.551 } 00:13:12.551 ] 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.551 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.810 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.810 "name": "Existed_Raid", 00:13:12.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.810 "strip_size_kb": 64, 00:13:12.810 "state": "configuring", 00:13:12.810 "raid_level": "raid0", 00:13:12.810 "superblock": false, 00:13:12.810 "num_base_bdevs": 3, 00:13:12.810 "num_base_bdevs_discovered": 2, 00:13:12.810 "num_base_bdevs_operational": 3, 00:13:12.810 "base_bdevs_list": [ 00:13:12.810 { 00:13:12.810 "name": "BaseBdev1", 00:13:12.810 "uuid": "b8b7b642-ac50-4482-b93c-2439877e4cc8", 00:13:12.810 "is_configured": true, 00:13:12.810 "data_offset": 0, 00:13:12.810 "data_size": 65536 00:13:12.810 }, 00:13:12.810 { 00:13:12.810 "name": "BaseBdev2", 00:13:12.810 "uuid": "3052383f-c771-43d7-a64d-0b95a5705f69", 00:13:12.810 "is_configured": true, 00:13:12.810 "data_offset": 0, 00:13:12.810 "data_size": 65536 00:13:12.810 }, 00:13:12.810 { 00:13:12.810 "name": "BaseBdev3", 00:13:12.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.810 "is_configured": false, 00:13:12.810 "data_offset": 0, 00:13:12.810 "data_size": 0 00:13:12.810 } 00:13:12.810 ] 00:13:12.810 }' 00:13:12.810 10:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.810 10:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.378 10:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:13.637 [2024-07-12 10:39:48.694312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:13.637 [2024-07-12 10:39:48.694347] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8e4400 00:13:13.637 [2024-07-12 10:39:48.694356] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:13.637 [2024-07-12 10:39:48.694609] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8e3ef0 00:13:13.637 [2024-07-12 10:39:48.694726] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8e4400 00:13:13.637 [2024-07-12 10:39:48.694737] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 
0x8e4400 00:13:13.637 [2024-07-12 10:39:48.694895] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:13.637 BaseBdev3 00:13:13.637 10:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:13.637 10:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:13.637 10:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:13.637 10:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:13.637 10:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:13.637 10:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:13.637 10:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.896 10:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:14.155 [ 00:13:14.155 { 00:13:14.155 "name": "BaseBdev3", 00:13:14.155 "aliases": [ 00:13:14.155 "de5672d2-c5f5-4fa5-8819-cf9ef3ab5c2d" 00:13:14.155 ], 00:13:14.155 "product_name": "Malloc disk", 00:13:14.155 "block_size": 512, 00:13:14.155 "num_blocks": 65536, 00:13:14.155 "uuid": "de5672d2-c5f5-4fa5-8819-cf9ef3ab5c2d", 00:13:14.155 "assigned_rate_limits": { 00:13:14.155 "rw_ios_per_sec": 0, 00:13:14.155 "rw_mbytes_per_sec": 0, 00:13:14.155 "r_mbytes_per_sec": 0, 00:13:14.155 "w_mbytes_per_sec": 0 00:13:14.155 }, 00:13:14.155 "claimed": true, 00:13:14.155 "claim_type": "exclusive_write", 00:13:14.155 "zoned": false, 00:13:14.155 "supported_io_types": { 00:13:14.155 "read": true, 00:13:14.155 "write": true, 00:13:14.155 "unmap": true, 00:13:14.155 "flush": true, 00:13:14.155 "reset": true, 00:13:14.155 "nvme_admin": false, 00:13:14.155 "nvme_io": false, 00:13:14.155 "nvme_io_md": false, 00:13:14.155 "write_zeroes": true, 00:13:14.155 "zcopy": true, 00:13:14.155 "get_zone_info": false, 00:13:14.155 "zone_management": false, 00:13:14.155 "zone_append": false, 00:13:14.155 "compare": false, 00:13:14.155 "compare_and_write": false, 00:13:14.155 "abort": true, 00:13:14.155 "seek_hole": false, 00:13:14.155 "seek_data": false, 00:13:14.155 "copy": true, 00:13:14.155 "nvme_iov_md": false 00:13:14.155 }, 00:13:14.155 "memory_domains": [ 00:13:14.155 { 00:13:14.155 "dma_device_id": "system", 00:13:14.155 "dma_device_type": 1 00:13:14.155 }, 00:13:14.155 { 00:13:14.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.155 "dma_device_type": 2 00:13:14.156 } 00:13:14.156 ], 00:13:14.156 "driver_specific": {} 00:13:14.156 } 00:13:14.156 ] 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.156 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:14.415 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.415 "name": "Existed_Raid", 00:13:14.415 "uuid": "be2add25-4c13-4e13-b508-2ae714448fcd", 00:13:14.415 "strip_size_kb": 64, 00:13:14.415 "state": "online", 00:13:14.415 "raid_level": "raid0", 00:13:14.415 "superblock": false, 00:13:14.415 "num_base_bdevs": 3, 00:13:14.415 "num_base_bdevs_discovered": 3, 00:13:14.415 "num_base_bdevs_operational": 3, 00:13:14.415 "base_bdevs_list": [ 00:13:14.415 { 00:13:14.415 "name": "BaseBdev1", 00:13:14.415 "uuid": "b8b7b642-ac50-4482-b93c-2439877e4cc8", 00:13:14.415 "is_configured": true, 00:13:14.415 "data_offset": 0, 00:13:14.415 "data_size": 65536 00:13:14.415 }, 00:13:14.415 { 00:13:14.415 "name": "BaseBdev2", 00:13:14.415 "uuid": "3052383f-c771-43d7-a64d-0b95a5705f69", 00:13:14.415 "is_configured": true, 00:13:14.415 "data_offset": 0, 00:13:14.415 "data_size": 65536 00:13:14.415 }, 00:13:14.415 { 00:13:14.415 "name": "BaseBdev3", 00:13:14.415 "uuid": "de5672d2-c5f5-4fa5-8819-cf9ef3ab5c2d", 00:13:14.415 "is_configured": true, 00:13:14.415 "data_offset": 0, 00:13:14.415 "data_size": 65536 00:13:14.415 } 00:13:14.415 ] 00:13:14.415 }' 00:13:14.415 10:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.415 10:39:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:14.984 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:15.244 
[2024-07-12 10:39:50.194638] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:15.244 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:15.244 "name": "Existed_Raid", 00:13:15.244 "aliases": [ 00:13:15.244 "be2add25-4c13-4e13-b508-2ae714448fcd" 00:13:15.244 ], 00:13:15.244 "product_name": "Raid Volume", 00:13:15.244 "block_size": 512, 00:13:15.244 "num_blocks": 196608, 00:13:15.244 "uuid": "be2add25-4c13-4e13-b508-2ae714448fcd", 00:13:15.244 "assigned_rate_limits": { 00:13:15.244 "rw_ios_per_sec": 0, 00:13:15.244 "rw_mbytes_per_sec": 0, 00:13:15.244 "r_mbytes_per_sec": 0, 00:13:15.244 "w_mbytes_per_sec": 0 00:13:15.244 }, 00:13:15.244 "claimed": false, 00:13:15.244 "zoned": false, 00:13:15.244 "supported_io_types": { 00:13:15.244 "read": true, 00:13:15.244 "write": true, 00:13:15.244 "unmap": true, 00:13:15.244 "flush": true, 00:13:15.244 "reset": true, 00:13:15.244 "nvme_admin": false, 00:13:15.244 "nvme_io": false, 00:13:15.244 "nvme_io_md": false, 00:13:15.244 "write_zeroes": true, 00:13:15.244 "zcopy": false, 00:13:15.244 "get_zone_info": false, 00:13:15.244 "zone_management": false, 00:13:15.244 "zone_append": false, 00:13:15.244 "compare": false, 00:13:15.244 "compare_and_write": false, 00:13:15.244 "abort": false, 00:13:15.244 "seek_hole": false, 00:13:15.244 "seek_data": false, 00:13:15.244 "copy": false, 00:13:15.244 "nvme_iov_md": false 00:13:15.244 }, 00:13:15.244 "memory_domains": [ 00:13:15.244 { 00:13:15.244 "dma_device_id": "system", 00:13:15.244 "dma_device_type": 1 00:13:15.244 }, 00:13:15.244 { 00:13:15.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.244 "dma_device_type": 2 00:13:15.244 }, 00:13:15.244 { 00:13:15.244 "dma_device_id": "system", 00:13:15.244 "dma_device_type": 1 00:13:15.244 }, 00:13:15.244 { 00:13:15.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.244 "dma_device_type": 2 00:13:15.244 }, 00:13:15.244 { 00:13:15.244 "dma_device_id": "system", 00:13:15.244 "dma_device_type": 1 00:13:15.244 }, 00:13:15.244 { 00:13:15.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.244 "dma_device_type": 2 00:13:15.244 } 00:13:15.244 ], 00:13:15.244 "driver_specific": { 00:13:15.244 "raid": { 00:13:15.244 "uuid": "be2add25-4c13-4e13-b508-2ae714448fcd", 00:13:15.244 "strip_size_kb": 64, 00:13:15.244 "state": "online", 00:13:15.244 "raid_level": "raid0", 00:13:15.244 "superblock": false, 00:13:15.244 "num_base_bdevs": 3, 00:13:15.244 "num_base_bdevs_discovered": 3, 00:13:15.244 "num_base_bdevs_operational": 3, 00:13:15.244 "base_bdevs_list": [ 00:13:15.244 { 00:13:15.244 "name": "BaseBdev1", 00:13:15.244 "uuid": "b8b7b642-ac50-4482-b93c-2439877e4cc8", 00:13:15.244 "is_configured": true, 00:13:15.244 "data_offset": 0, 00:13:15.244 "data_size": 65536 00:13:15.244 }, 00:13:15.244 { 00:13:15.244 "name": "BaseBdev2", 00:13:15.244 "uuid": "3052383f-c771-43d7-a64d-0b95a5705f69", 00:13:15.244 "is_configured": true, 00:13:15.244 "data_offset": 0, 00:13:15.244 "data_size": 65536 00:13:15.244 }, 00:13:15.244 { 00:13:15.244 "name": "BaseBdev3", 00:13:15.244 "uuid": "de5672d2-c5f5-4fa5-8819-cf9ef3ab5c2d", 00:13:15.244 "is_configured": true, 00:13:15.244 "data_offset": 0, 00:13:15.244 "data_size": 65536 00:13:15.244 } 00:13:15.244 ] 00:13:15.244 } 00:13:15.244 } 00:13:15.244 }' 00:13:15.244 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:15.244 10:39:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:15.244 BaseBdev2 00:13:15.244 BaseBdev3' 00:13:15.244 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.244 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:15.244 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.503 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.504 "name": "BaseBdev1", 00:13:15.504 "aliases": [ 00:13:15.504 "b8b7b642-ac50-4482-b93c-2439877e4cc8" 00:13:15.504 ], 00:13:15.504 "product_name": "Malloc disk", 00:13:15.504 "block_size": 512, 00:13:15.504 "num_blocks": 65536, 00:13:15.504 "uuid": "b8b7b642-ac50-4482-b93c-2439877e4cc8", 00:13:15.504 "assigned_rate_limits": { 00:13:15.504 "rw_ios_per_sec": 0, 00:13:15.504 "rw_mbytes_per_sec": 0, 00:13:15.504 "r_mbytes_per_sec": 0, 00:13:15.504 "w_mbytes_per_sec": 0 00:13:15.504 }, 00:13:15.504 "claimed": true, 00:13:15.504 "claim_type": "exclusive_write", 00:13:15.504 "zoned": false, 00:13:15.504 "supported_io_types": { 00:13:15.504 "read": true, 00:13:15.504 "write": true, 00:13:15.504 "unmap": true, 00:13:15.504 "flush": true, 00:13:15.504 "reset": true, 00:13:15.504 "nvme_admin": false, 00:13:15.504 "nvme_io": false, 00:13:15.504 "nvme_io_md": false, 00:13:15.504 "write_zeroes": true, 00:13:15.504 "zcopy": true, 00:13:15.504 "get_zone_info": false, 00:13:15.504 "zone_management": false, 00:13:15.504 "zone_append": false, 00:13:15.504 "compare": false, 00:13:15.504 "compare_and_write": false, 00:13:15.504 "abort": true, 00:13:15.504 "seek_hole": false, 00:13:15.504 "seek_data": false, 00:13:15.504 "copy": true, 00:13:15.504 "nvme_iov_md": false 00:13:15.504 }, 00:13:15.504 "memory_domains": [ 00:13:15.504 { 00:13:15.504 "dma_device_id": "system", 00:13:15.504 "dma_device_type": 1 00:13:15.504 }, 00:13:15.504 { 00:13:15.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.504 "dma_device_type": 2 00:13:15.504 } 00:13:15.504 ], 00:13:15.504 "driver_specific": {} 00:13:15.504 }' 00:13:15.504 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.504 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.504 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.504 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.504 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.504 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.504 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.763 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.763 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.763 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.763 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.763 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.763 10:39:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.763 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.763 10:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:16.022 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:16.022 "name": "BaseBdev2", 00:13:16.022 "aliases": [ 00:13:16.022 "3052383f-c771-43d7-a64d-0b95a5705f69" 00:13:16.022 ], 00:13:16.022 "product_name": "Malloc disk", 00:13:16.022 "block_size": 512, 00:13:16.022 "num_blocks": 65536, 00:13:16.022 "uuid": "3052383f-c771-43d7-a64d-0b95a5705f69", 00:13:16.022 "assigned_rate_limits": { 00:13:16.022 "rw_ios_per_sec": 0, 00:13:16.022 "rw_mbytes_per_sec": 0, 00:13:16.022 "r_mbytes_per_sec": 0, 00:13:16.022 "w_mbytes_per_sec": 0 00:13:16.022 }, 00:13:16.022 "claimed": true, 00:13:16.022 "claim_type": "exclusive_write", 00:13:16.022 "zoned": false, 00:13:16.022 "supported_io_types": { 00:13:16.022 "read": true, 00:13:16.022 "write": true, 00:13:16.022 "unmap": true, 00:13:16.022 "flush": true, 00:13:16.022 "reset": true, 00:13:16.022 "nvme_admin": false, 00:13:16.022 "nvme_io": false, 00:13:16.022 "nvme_io_md": false, 00:13:16.022 "write_zeroes": true, 00:13:16.022 "zcopy": true, 00:13:16.022 "get_zone_info": false, 00:13:16.022 "zone_management": false, 00:13:16.022 "zone_append": false, 00:13:16.022 "compare": false, 00:13:16.022 "compare_and_write": false, 00:13:16.022 "abort": true, 00:13:16.022 "seek_hole": false, 00:13:16.022 "seek_data": false, 00:13:16.022 "copy": true, 00:13:16.022 "nvme_iov_md": false 00:13:16.022 }, 00:13:16.022 "memory_domains": [ 00:13:16.022 { 00:13:16.022 "dma_device_id": "system", 00:13:16.022 "dma_device_type": 1 00:13:16.022 }, 00:13:16.022 { 00:13:16.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.022 "dma_device_type": 2 00:13:16.022 } 00:13:16.022 ], 00:13:16.022 "driver_specific": {} 00:13:16.022 }' 00:13:16.022 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.022 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.022 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:16.022 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:16.281 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:16.539 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:16.539 "name": "BaseBdev3", 00:13:16.539 "aliases": [ 00:13:16.539 "de5672d2-c5f5-4fa5-8819-cf9ef3ab5c2d" 00:13:16.539 ], 00:13:16.539 "product_name": "Malloc disk", 00:13:16.539 "block_size": 512, 00:13:16.539 "num_blocks": 65536, 00:13:16.539 "uuid": "de5672d2-c5f5-4fa5-8819-cf9ef3ab5c2d", 00:13:16.539 "assigned_rate_limits": { 00:13:16.539 "rw_ios_per_sec": 0, 00:13:16.539 "rw_mbytes_per_sec": 0, 00:13:16.539 "r_mbytes_per_sec": 0, 00:13:16.539 "w_mbytes_per_sec": 0 00:13:16.539 }, 00:13:16.539 "claimed": true, 00:13:16.539 "claim_type": "exclusive_write", 00:13:16.539 "zoned": false, 00:13:16.539 "supported_io_types": { 00:13:16.539 "read": true, 00:13:16.539 "write": true, 00:13:16.539 "unmap": true, 00:13:16.539 "flush": true, 00:13:16.539 "reset": true, 00:13:16.539 "nvme_admin": false, 00:13:16.539 "nvme_io": false, 00:13:16.539 "nvme_io_md": false, 00:13:16.539 "write_zeroes": true, 00:13:16.539 "zcopy": true, 00:13:16.539 "get_zone_info": false, 00:13:16.539 "zone_management": false, 00:13:16.539 "zone_append": false, 00:13:16.539 "compare": false, 00:13:16.539 "compare_and_write": false, 00:13:16.539 "abort": true, 00:13:16.539 "seek_hole": false, 00:13:16.539 "seek_data": false, 00:13:16.539 "copy": true, 00:13:16.539 "nvme_iov_md": false 00:13:16.539 }, 00:13:16.539 "memory_domains": [ 00:13:16.539 { 00:13:16.539 "dma_device_id": "system", 00:13:16.539 "dma_device_type": 1 00:13:16.539 }, 00:13:16.539 { 00:13:16.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.539 "dma_device_type": 2 00:13:16.539 } 00:13:16.539 ], 00:13:16.539 "driver_specific": {} 00:13:16.539 }' 00:13:16.539 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.797 10:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.055 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:17.055 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:17.315 [2024-07-12 10:39:52.263885] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:17.315 [2024-07-12 10:39:52.263910] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:17.315 [2024-07-12 10:39:52.263950] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.315 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.883 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.883 "name": "Existed_Raid", 00:13:17.883 "uuid": "be2add25-4c13-4e13-b508-2ae714448fcd", 00:13:17.883 "strip_size_kb": 64, 00:13:17.883 "state": "offline", 00:13:17.883 "raid_level": "raid0", 00:13:17.883 "superblock": false, 00:13:17.883 "num_base_bdevs": 3, 00:13:17.883 "num_base_bdevs_discovered": 2, 00:13:17.883 "num_base_bdevs_operational": 2, 00:13:17.883 "base_bdevs_list": [ 00:13:17.883 { 00:13:17.883 "name": null, 00:13:17.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.883 "is_configured": false, 00:13:17.883 "data_offset": 0, 00:13:17.883 "data_size": 65536 00:13:17.883 }, 00:13:17.883 { 00:13:17.883 "name": "BaseBdev2", 00:13:17.883 "uuid": "3052383f-c771-43d7-a64d-0b95a5705f69", 00:13:17.883 "is_configured": true, 00:13:17.883 "data_offset": 0, 00:13:17.883 "data_size": 65536 00:13:17.883 }, 00:13:17.883 { 00:13:17.883 "name": "BaseBdev3", 00:13:17.883 "uuid": "de5672d2-c5f5-4fa5-8819-cf9ef3ab5c2d", 00:13:17.883 "is_configured": true, 00:13:17.883 "data_offset": 0, 00:13:17.883 "data_size": 65536 00:13:17.883 } 00:13:17.883 ] 00:13:17.883 }' 00:13:17.883 10:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.883 10:39:52 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.141 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:18.141 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:18.141 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.141 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:18.401 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:18.401 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:18.401 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:18.660 [2024-07-12 10:39:53.704733] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:18.660 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:18.660 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:18.660 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.660 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:18.919 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:18.919 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:18.919 10:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:19.179 [2024-07-12 10:39:54.210544] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:19.179 [2024-07-12 10:39:54.210588] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8e4400 name Existed_Raid, state offline 00:13:19.179 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:19.179 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:19.179 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.179 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:19.438 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:19.438 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:19.438 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:19.438 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:19.438 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:19.438 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:19.697 BaseBdev2 00:13:19.697 10:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:19.697 10:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:19.697 10:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:19.697 10:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:19.697 10:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:19.697 10:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:19.697 10:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:19.957 10:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:20.216 [ 00:13:20.216 { 00:13:20.216 "name": "BaseBdev2", 00:13:20.216 "aliases": [ 00:13:20.216 "97e1b141-067f-4e47-a3a3-3e087bb5f759" 00:13:20.216 ], 00:13:20.216 "product_name": "Malloc disk", 00:13:20.216 "block_size": 512, 00:13:20.216 "num_blocks": 65536, 00:13:20.216 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:20.216 "assigned_rate_limits": { 00:13:20.216 "rw_ios_per_sec": 0, 00:13:20.216 "rw_mbytes_per_sec": 0, 00:13:20.216 "r_mbytes_per_sec": 0, 00:13:20.216 "w_mbytes_per_sec": 0 00:13:20.216 }, 00:13:20.216 "claimed": false, 00:13:20.216 "zoned": false, 00:13:20.216 "supported_io_types": { 00:13:20.216 "read": true, 00:13:20.216 "write": true, 00:13:20.216 "unmap": true, 00:13:20.216 "flush": true, 00:13:20.216 "reset": true, 00:13:20.216 "nvme_admin": false, 00:13:20.216 "nvme_io": false, 00:13:20.216 "nvme_io_md": false, 00:13:20.216 "write_zeroes": true, 00:13:20.216 "zcopy": true, 00:13:20.216 "get_zone_info": false, 00:13:20.216 "zone_management": false, 00:13:20.216 "zone_append": false, 00:13:20.216 "compare": false, 00:13:20.216 "compare_and_write": false, 00:13:20.216 "abort": true, 00:13:20.216 "seek_hole": false, 00:13:20.216 "seek_data": false, 00:13:20.216 "copy": true, 00:13:20.216 "nvme_iov_md": false 00:13:20.216 }, 00:13:20.216 "memory_domains": [ 00:13:20.216 { 00:13:20.216 "dma_device_id": "system", 00:13:20.216 "dma_device_type": 1 00:13:20.216 }, 00:13:20.216 { 00:13:20.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.216 "dma_device_type": 2 00:13:20.216 } 00:13:20.216 ], 00:13:20.216 "driver_specific": {} 00:13:20.216 } 00:13:20.216 ] 00:13:20.216 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:20.216 10:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:20.216 10:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:20.216 10:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:20.476 BaseBdev3 00:13:20.476 10:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:20.476 10:39:55 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:20.476 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:20.476 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:20.476 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:20.476 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:20.476 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:20.735 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:20.994 [ 00:13:20.994 { 00:13:20.994 "name": "BaseBdev3", 00:13:20.994 "aliases": [ 00:13:20.994 "1df8cc08-66f7-4c80-9dbb-e56e46080e22" 00:13:20.994 ], 00:13:20.994 "product_name": "Malloc disk", 00:13:20.994 "block_size": 512, 00:13:20.994 "num_blocks": 65536, 00:13:20.994 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:20.994 "assigned_rate_limits": { 00:13:20.994 "rw_ios_per_sec": 0, 00:13:20.994 "rw_mbytes_per_sec": 0, 00:13:20.994 "r_mbytes_per_sec": 0, 00:13:20.994 "w_mbytes_per_sec": 0 00:13:20.994 }, 00:13:20.994 "claimed": false, 00:13:20.994 "zoned": false, 00:13:20.994 "supported_io_types": { 00:13:20.994 "read": true, 00:13:20.994 "write": true, 00:13:20.994 "unmap": true, 00:13:20.994 "flush": true, 00:13:20.994 "reset": true, 00:13:20.994 "nvme_admin": false, 00:13:20.994 "nvme_io": false, 00:13:20.995 "nvme_io_md": false, 00:13:20.995 "write_zeroes": true, 00:13:20.995 "zcopy": true, 00:13:20.995 "get_zone_info": false, 00:13:20.995 "zone_management": false, 00:13:20.995 "zone_append": false, 00:13:20.995 "compare": false, 00:13:20.995 "compare_and_write": false, 00:13:20.995 "abort": true, 00:13:20.995 "seek_hole": false, 00:13:20.995 "seek_data": false, 00:13:20.995 "copy": true, 00:13:20.995 "nvme_iov_md": false 00:13:20.995 }, 00:13:20.995 "memory_domains": [ 00:13:20.995 { 00:13:20.995 "dma_device_id": "system", 00:13:20.995 "dma_device_type": 1 00:13:20.995 }, 00:13:20.995 { 00:13:20.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.995 "dma_device_type": 2 00:13:20.995 } 00:13:20.995 ], 00:13:20.995 "driver_specific": {} 00:13:20.995 } 00:13:20.995 ] 00:13:20.995 10:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:20.995 10:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:20.995 10:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:20.995 10:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:21.254 [2024-07-12 10:39:56.196102] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:21.254 [2024-07-12 10:39:56.196142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:21.254 [2024-07-12 10:39:56.196160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:21.254 
[2024-07-12 10:39:56.197466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.254 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.513 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.513 "name": "Existed_Raid", 00:13:21.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.513 "strip_size_kb": 64, 00:13:21.513 "state": "configuring", 00:13:21.513 "raid_level": "raid0", 00:13:21.513 "superblock": false, 00:13:21.513 "num_base_bdevs": 3, 00:13:21.513 "num_base_bdevs_discovered": 2, 00:13:21.513 "num_base_bdevs_operational": 3, 00:13:21.513 "base_bdevs_list": [ 00:13:21.513 { 00:13:21.513 "name": "BaseBdev1", 00:13:21.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.513 "is_configured": false, 00:13:21.513 "data_offset": 0, 00:13:21.513 "data_size": 0 00:13:21.513 }, 00:13:21.513 { 00:13:21.513 "name": "BaseBdev2", 00:13:21.513 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:21.513 "is_configured": true, 00:13:21.513 "data_offset": 0, 00:13:21.513 "data_size": 65536 00:13:21.513 }, 00:13:21.513 { 00:13:21.513 "name": "BaseBdev3", 00:13:21.514 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:21.514 "is_configured": true, 00:13:21.514 "data_offset": 0, 00:13:21.514 "data_size": 65536 00:13:21.514 } 00:13:21.514 ] 00:13:21.514 }' 00:13:21.514 10:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.514 10:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.081 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:22.340 [2024-07-12 10:39:57.303024] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:22.340 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.599 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.599 "name": "Existed_Raid", 00:13:22.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:22.599 "strip_size_kb": 64, 00:13:22.599 "state": "configuring", 00:13:22.599 "raid_level": "raid0", 00:13:22.599 "superblock": false, 00:13:22.599 "num_base_bdevs": 3, 00:13:22.599 "num_base_bdevs_discovered": 1, 00:13:22.599 "num_base_bdevs_operational": 3, 00:13:22.599 "base_bdevs_list": [ 00:13:22.599 { 00:13:22.599 "name": "BaseBdev1", 00:13:22.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:22.599 "is_configured": false, 00:13:22.599 "data_offset": 0, 00:13:22.599 "data_size": 0 00:13:22.599 }, 00:13:22.599 { 00:13:22.599 "name": null, 00:13:22.599 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:22.599 "is_configured": false, 00:13:22.599 "data_offset": 0, 00:13:22.599 "data_size": 65536 00:13:22.599 }, 00:13:22.599 { 00:13:22.599 "name": "BaseBdev3", 00:13:22.599 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:22.599 "is_configured": true, 00:13:22.599 "data_offset": 0, 00:13:22.599 "data_size": 65536 00:13:22.599 } 00:13:22.599 ] 00:13:22.599 }' 00:13:22.599 10:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.599 10:39:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.168 10:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.168 10:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:23.427 10:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:23.427 10:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:23.686 [2024-07-12 10:39:58.686072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:23.686 BaseBdev1 00:13:23.686 10:39:58 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:23.686 10:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:23.686 10:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:23.686 10:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:23.686 10:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:23.686 10:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:23.686 10:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:23.984 10:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:24.244 [ 00:13:24.244 { 00:13:24.244 "name": "BaseBdev1", 00:13:24.244 "aliases": [ 00:13:24.244 "49d11b7f-fa79-489a-977b-3c076a6d8deb" 00:13:24.244 ], 00:13:24.244 "product_name": "Malloc disk", 00:13:24.244 "block_size": 512, 00:13:24.244 "num_blocks": 65536, 00:13:24.244 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:24.244 "assigned_rate_limits": { 00:13:24.244 "rw_ios_per_sec": 0, 00:13:24.244 "rw_mbytes_per_sec": 0, 00:13:24.244 "r_mbytes_per_sec": 0, 00:13:24.244 "w_mbytes_per_sec": 0 00:13:24.244 }, 00:13:24.244 "claimed": true, 00:13:24.244 "claim_type": "exclusive_write", 00:13:24.244 "zoned": false, 00:13:24.244 "supported_io_types": { 00:13:24.244 "read": true, 00:13:24.244 "write": true, 00:13:24.244 "unmap": true, 00:13:24.244 "flush": true, 00:13:24.244 "reset": true, 00:13:24.244 "nvme_admin": false, 00:13:24.244 "nvme_io": false, 00:13:24.244 "nvme_io_md": false, 00:13:24.244 "write_zeroes": true, 00:13:24.244 "zcopy": true, 00:13:24.244 "get_zone_info": false, 00:13:24.244 "zone_management": false, 00:13:24.244 "zone_append": false, 00:13:24.244 "compare": false, 00:13:24.244 "compare_and_write": false, 00:13:24.244 "abort": true, 00:13:24.244 "seek_hole": false, 00:13:24.244 "seek_data": false, 00:13:24.244 "copy": true, 00:13:24.244 "nvme_iov_md": false 00:13:24.244 }, 00:13:24.244 "memory_domains": [ 00:13:24.244 { 00:13:24.244 "dma_device_id": "system", 00:13:24.244 "dma_device_type": 1 00:13:24.244 }, 00:13:24.244 { 00:13:24.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.244 "dma_device_type": 2 00:13:24.244 } 00:13:24.244 ], 00:13:24.244 "driver_specific": {} 00:13:24.244 } 00:13:24.244 ] 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.244 10:39:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.244 "name": "Existed_Raid", 00:13:24.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.244 "strip_size_kb": 64, 00:13:24.244 "state": "configuring", 00:13:24.244 "raid_level": "raid0", 00:13:24.244 "superblock": false, 00:13:24.244 "num_base_bdevs": 3, 00:13:24.244 "num_base_bdevs_discovered": 2, 00:13:24.244 "num_base_bdevs_operational": 3, 00:13:24.244 "base_bdevs_list": [ 00:13:24.244 { 00:13:24.244 "name": "BaseBdev1", 00:13:24.244 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:24.244 "is_configured": true, 00:13:24.244 "data_offset": 0, 00:13:24.244 "data_size": 65536 00:13:24.244 }, 00:13:24.244 { 00:13:24.244 "name": null, 00:13:24.244 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:24.244 "is_configured": false, 00:13:24.244 "data_offset": 0, 00:13:24.244 "data_size": 65536 00:13:24.244 }, 00:13:24.244 { 00:13:24.244 "name": "BaseBdev3", 00:13:24.244 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:24.244 "is_configured": true, 00:13:24.244 "data_offset": 0, 00:13:24.244 "data_size": 65536 00:13:24.244 } 00:13:24.244 ] 00:13:24.244 }' 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.244 10:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.810 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.810 10:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:25.067 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:25.067 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:25.326 [2024-07-12 10:40:00.418692] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.326 10:40:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.326 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.585 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.585 "name": "Existed_Raid", 00:13:25.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.585 "strip_size_kb": 64, 00:13:25.585 "state": "configuring", 00:13:25.585 "raid_level": "raid0", 00:13:25.585 "superblock": false, 00:13:25.585 "num_base_bdevs": 3, 00:13:25.585 "num_base_bdevs_discovered": 1, 00:13:25.585 "num_base_bdevs_operational": 3, 00:13:25.585 "base_bdevs_list": [ 00:13:25.585 { 00:13:25.585 "name": "BaseBdev1", 00:13:25.585 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:25.585 "is_configured": true, 00:13:25.585 "data_offset": 0, 00:13:25.585 "data_size": 65536 00:13:25.585 }, 00:13:25.585 { 00:13:25.585 "name": null, 00:13:25.585 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:25.585 "is_configured": false, 00:13:25.585 "data_offset": 0, 00:13:25.585 "data_size": 65536 00:13:25.585 }, 00:13:25.585 { 00:13:25.585 "name": null, 00:13:25.585 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:25.585 "is_configured": false, 00:13:25.585 "data_offset": 0, 00:13:25.585 "data_size": 65536 00:13:25.585 } 00:13:25.585 ] 00:13:25.586 }' 00:13:25.586 10:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.586 10:40:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.153 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.153 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:26.412 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:26.412 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:26.668 [2024-07-12 10:40:01.613881] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
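Each verify_raid_bdev_state block in this trace follows the same pattern: dump all raid bdevs over the RPC socket, select the array under test with jq, and compare a handful of fields against the expected values. The sketch below is a hedged re-implementation of that check, not the exact helper from bdev_raid.sh; the field names come from the JSON dumps above, while the function name and argument order are only loosely modeled on the script.

    # Sketch of the recurring state check (assumes jq and a live /var/tmp/spdk-raid.sock).
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    check_raid_state() {
        local name=$1 state=$2 level=$3 strip=$4 operational=$5 info
        info=$(rpc bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
        [ "$(jq -r .state <<< "$info")" = "$state" ] &&
        [ "$(jq -r .raid_level <<< "$info")" = "$level" ] &&
        [ "$(jq -r .strip_size_kb <<< "$info")" = "$strip" ] &&
        [ "$(jq -r .num_base_bdevs_operational <<< "$info")" = "$operational" ]
    }

    # Matches the dump above: still configuring, raid0, 64 KiB strips, 3 slots expected.
    check_raid_state Existed_Raid configuring raid0 64 3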
00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.668 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.925 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.925 "name": "Existed_Raid", 00:13:26.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.925 "strip_size_kb": 64, 00:13:26.925 "state": "configuring", 00:13:26.925 "raid_level": "raid0", 00:13:26.925 "superblock": false, 00:13:26.925 "num_base_bdevs": 3, 00:13:26.925 "num_base_bdevs_discovered": 2, 00:13:26.925 "num_base_bdevs_operational": 3, 00:13:26.925 "base_bdevs_list": [ 00:13:26.925 { 00:13:26.925 "name": "BaseBdev1", 00:13:26.925 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:26.925 "is_configured": true, 00:13:26.925 "data_offset": 0, 00:13:26.925 "data_size": 65536 00:13:26.925 }, 00:13:26.925 { 00:13:26.925 "name": null, 00:13:26.925 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:26.925 "is_configured": false, 00:13:26.925 "data_offset": 0, 00:13:26.925 "data_size": 65536 00:13:26.925 }, 00:13:26.925 { 00:13:26.925 "name": "BaseBdev3", 00:13:26.925 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:26.925 "is_configured": true, 00:13:26.925 "data_offset": 0, 00:13:26.925 "data_size": 65536 00:13:26.925 } 00:13:26.925 ] 00:13:26.925 }' 00:13:26.925 10:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.926 10:40:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.489 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.489 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:27.746 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:27.746 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:28.004 [2024-07-12 10:40:02.965541] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:28.004 
10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.004 10:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.260 10:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.260 "name": "Existed_Raid", 00:13:28.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.260 "strip_size_kb": 64, 00:13:28.260 "state": "configuring", 00:13:28.260 "raid_level": "raid0", 00:13:28.260 "superblock": false, 00:13:28.260 "num_base_bdevs": 3, 00:13:28.260 "num_base_bdevs_discovered": 1, 00:13:28.260 "num_base_bdevs_operational": 3, 00:13:28.260 "base_bdevs_list": [ 00:13:28.260 { 00:13:28.260 "name": null, 00:13:28.260 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:28.260 "is_configured": false, 00:13:28.260 "data_offset": 0, 00:13:28.260 "data_size": 65536 00:13:28.260 }, 00:13:28.260 { 00:13:28.260 "name": null, 00:13:28.260 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:28.260 "is_configured": false, 00:13:28.260 "data_offset": 0, 00:13:28.260 "data_size": 65536 00:13:28.260 }, 00:13:28.260 { 00:13:28.260 "name": "BaseBdev3", 00:13:28.261 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:28.261 "is_configured": true, 00:13:28.261 "data_offset": 0, 00:13:28.261 "data_size": 65536 00:13:28.261 } 00:13:28.261 ] 00:13:28.261 }' 00:13:28.261 10:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.261 10:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.834 10:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.834 10:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:28.834 10:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:28.834 10:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:29.091 [2024-07-12 10:40:04.205313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
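These last steps exercise both ways a base bdev can leave the array: bdev_raid_remove_base_bdev detaches it explicitly, while bdev_malloc_delete BaseBdev1 deletes the underlying disk out from under the raid; in either case the slot in base_bdevs_list drops to "name": null / "is_configured": false. A vacated slot can then be refilled with bdev_raid_add_base_bdev while the array is still configuring, which is what the BaseBdev2 add just above does. A condensed, standalone sketch of that remove/re-add cycle (bdev and raid names copied from the trace, ordering simplified):

    # Sketch of the remove / re-add cycle; assumes the raid from the earlier sketch exists.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # Detach one base bdev from the array (the malloc disk itself survives) ...
    rpc bdev_raid_remove_base_bdev BaseBdev2
    # ... and destroy another outright; the raid sees both as unconfigured slots.
    rpc bdev_malloc_delete BaseBdev1
    rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[] | {name, is_configured}'

    # Re-attach the surviving disk; the array stays in "configuring" until all
    # three slots are populated again.
    rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2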
00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.091 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.348 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.348 "name": "Existed_Raid", 00:13:29.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.348 "strip_size_kb": 64, 00:13:29.348 "state": "configuring", 00:13:29.348 "raid_level": "raid0", 00:13:29.348 "superblock": false, 00:13:29.348 "num_base_bdevs": 3, 00:13:29.348 "num_base_bdevs_discovered": 2, 00:13:29.348 "num_base_bdevs_operational": 3, 00:13:29.348 "base_bdevs_list": [ 00:13:29.348 { 00:13:29.348 "name": null, 00:13:29.348 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:29.348 "is_configured": false, 00:13:29.348 "data_offset": 0, 00:13:29.348 "data_size": 65536 00:13:29.348 }, 00:13:29.348 { 00:13:29.348 "name": "BaseBdev2", 00:13:29.348 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:29.348 "is_configured": true, 00:13:29.348 "data_offset": 0, 00:13:29.348 "data_size": 65536 00:13:29.348 }, 00:13:29.348 { 00:13:29.348 "name": "BaseBdev3", 00:13:29.348 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:29.348 "is_configured": true, 00:13:29.348 "data_offset": 0, 00:13:29.348 "data_size": 65536 00:13:29.348 } 00:13:29.348 ] 00:13:29.348 }' 00:13:29.348 10:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.348 10:40:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.913 10:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.913 10:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:30.171 10:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:30.171 10:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.171 10:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:30.429 10:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 49d11b7f-fa79-489a-977b-3c076a6d8deb 00:13:30.686 [2024-07-12 10:40:05.814438] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:30.686 [2024-07-12 10:40:05.814474] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8e2450 00:13:30.686 [2024-07-12 10:40:05.814494] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:30.686 [2024-07-12 10:40:05.814686] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8e3a50 00:13:30.686 [2024-07-12 10:40:05.814800] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8e2450 00:13:30.686 [2024-07-12 10:40:05.814810] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8e2450 00:13:30.686 [2024-07-12 10:40:05.814968] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:30.686 NewBaseBdev 00:13:30.686 10:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:30.686 10:40:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:30.686 10:40:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:30.686 10:40:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:30.686 10:40:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:30.686 10:40:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:30.686 10:40:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:30.943 10:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:31.201 [ 00:13:31.201 { 00:13:31.201 "name": "NewBaseBdev", 00:13:31.201 "aliases": [ 00:13:31.201 "49d11b7f-fa79-489a-977b-3c076a6d8deb" 00:13:31.201 ], 00:13:31.201 "product_name": "Malloc disk", 00:13:31.201 "block_size": 512, 00:13:31.201 "num_blocks": 65536, 00:13:31.201 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:31.201 "assigned_rate_limits": { 00:13:31.201 "rw_ios_per_sec": 0, 00:13:31.201 "rw_mbytes_per_sec": 0, 00:13:31.201 "r_mbytes_per_sec": 0, 00:13:31.201 "w_mbytes_per_sec": 0 00:13:31.201 }, 00:13:31.201 "claimed": true, 00:13:31.201 "claim_type": "exclusive_write", 00:13:31.201 "zoned": false, 00:13:31.201 "supported_io_types": { 00:13:31.201 "read": true, 00:13:31.201 "write": true, 00:13:31.201 "unmap": true, 00:13:31.201 "flush": true, 00:13:31.201 "reset": true, 00:13:31.201 "nvme_admin": false, 00:13:31.201 "nvme_io": false, 00:13:31.201 "nvme_io_md": false, 00:13:31.201 "write_zeroes": true, 00:13:31.201 "zcopy": true, 00:13:31.201 "get_zone_info": false, 00:13:31.201 "zone_management": false, 00:13:31.201 "zone_append": false, 00:13:31.201 "compare": false, 00:13:31.201 "compare_and_write": false, 00:13:31.201 "abort": true, 00:13:31.201 "seek_hole": false, 00:13:31.201 "seek_data": false, 00:13:31.201 "copy": true, 00:13:31.201 "nvme_iov_md": false 00:13:31.201 }, 00:13:31.201 "memory_domains": [ 00:13:31.201 { 00:13:31.201 "dma_device_id": "system", 00:13:31.201 
"dma_device_type": 1 00:13:31.202 }, 00:13:31.202 { 00:13:31.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.202 "dma_device_type": 2 00:13:31.202 } 00:13:31.202 ], 00:13:31.202 "driver_specific": {} 00:13:31.202 } 00:13:31.202 ] 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.202 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.459 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.459 "name": "Existed_Raid", 00:13:31.459 "uuid": "cb97bee6-38db-4ac4-b1cf-c8a20ce0bf01", 00:13:31.459 "strip_size_kb": 64, 00:13:31.459 "state": "online", 00:13:31.459 "raid_level": "raid0", 00:13:31.459 "superblock": false, 00:13:31.459 "num_base_bdevs": 3, 00:13:31.459 "num_base_bdevs_discovered": 3, 00:13:31.459 "num_base_bdevs_operational": 3, 00:13:31.459 "base_bdevs_list": [ 00:13:31.459 { 00:13:31.459 "name": "NewBaseBdev", 00:13:31.459 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:31.459 "is_configured": true, 00:13:31.459 "data_offset": 0, 00:13:31.459 "data_size": 65536 00:13:31.459 }, 00:13:31.459 { 00:13:31.459 "name": "BaseBdev2", 00:13:31.459 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:31.459 "is_configured": true, 00:13:31.459 "data_offset": 0, 00:13:31.460 "data_size": 65536 00:13:31.460 }, 00:13:31.460 { 00:13:31.460 "name": "BaseBdev3", 00:13:31.460 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:31.460 "is_configured": true, 00:13:31.460 "data_offset": 0, 00:13:31.460 "data_size": 65536 00:13:31.460 } 00:13:31.460 ] 00:13:31.460 }' 00:13:31.460 10:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.460 10:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.023 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:32.023 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:32.023 10:40:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:32.023 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:32.023 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:32.024 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:32.024 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:32.024 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:32.281 [2024-07-12 10:40:07.362831] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:32.281 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:32.281 "name": "Existed_Raid", 00:13:32.281 "aliases": [ 00:13:32.281 "cb97bee6-38db-4ac4-b1cf-c8a20ce0bf01" 00:13:32.281 ], 00:13:32.281 "product_name": "Raid Volume", 00:13:32.281 "block_size": 512, 00:13:32.281 "num_blocks": 196608, 00:13:32.281 "uuid": "cb97bee6-38db-4ac4-b1cf-c8a20ce0bf01", 00:13:32.281 "assigned_rate_limits": { 00:13:32.281 "rw_ios_per_sec": 0, 00:13:32.281 "rw_mbytes_per_sec": 0, 00:13:32.281 "r_mbytes_per_sec": 0, 00:13:32.281 "w_mbytes_per_sec": 0 00:13:32.281 }, 00:13:32.281 "claimed": false, 00:13:32.281 "zoned": false, 00:13:32.281 "supported_io_types": { 00:13:32.281 "read": true, 00:13:32.281 "write": true, 00:13:32.281 "unmap": true, 00:13:32.281 "flush": true, 00:13:32.281 "reset": true, 00:13:32.281 "nvme_admin": false, 00:13:32.281 "nvme_io": false, 00:13:32.281 "nvme_io_md": false, 00:13:32.281 "write_zeroes": true, 00:13:32.281 "zcopy": false, 00:13:32.281 "get_zone_info": false, 00:13:32.281 "zone_management": false, 00:13:32.281 "zone_append": false, 00:13:32.281 "compare": false, 00:13:32.281 "compare_and_write": false, 00:13:32.281 "abort": false, 00:13:32.281 "seek_hole": false, 00:13:32.281 "seek_data": false, 00:13:32.281 "copy": false, 00:13:32.281 "nvme_iov_md": false 00:13:32.281 }, 00:13:32.281 "memory_domains": [ 00:13:32.281 { 00:13:32.281 "dma_device_id": "system", 00:13:32.281 "dma_device_type": 1 00:13:32.281 }, 00:13:32.281 { 00:13:32.281 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.281 "dma_device_type": 2 00:13:32.281 }, 00:13:32.281 { 00:13:32.281 "dma_device_id": "system", 00:13:32.281 "dma_device_type": 1 00:13:32.281 }, 00:13:32.281 { 00:13:32.281 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.281 "dma_device_type": 2 00:13:32.281 }, 00:13:32.281 { 00:13:32.281 "dma_device_id": "system", 00:13:32.281 "dma_device_type": 1 00:13:32.281 }, 00:13:32.281 { 00:13:32.281 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.281 "dma_device_type": 2 00:13:32.281 } 00:13:32.281 ], 00:13:32.281 "driver_specific": { 00:13:32.281 "raid": { 00:13:32.281 "uuid": "cb97bee6-38db-4ac4-b1cf-c8a20ce0bf01", 00:13:32.281 "strip_size_kb": 64, 00:13:32.281 "state": "online", 00:13:32.281 "raid_level": "raid0", 00:13:32.281 "superblock": false, 00:13:32.281 "num_base_bdevs": 3, 00:13:32.281 "num_base_bdevs_discovered": 3, 00:13:32.281 "num_base_bdevs_operational": 3, 00:13:32.281 "base_bdevs_list": [ 00:13:32.281 { 00:13:32.281 "name": "NewBaseBdev", 00:13:32.281 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:32.281 "is_configured": true, 00:13:32.281 "data_offset": 0, 00:13:32.281 "data_size": 65536 00:13:32.281 }, 00:13:32.281 { 00:13:32.281 "name": 
"BaseBdev2", 00:13:32.281 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:32.281 "is_configured": true, 00:13:32.281 "data_offset": 0, 00:13:32.281 "data_size": 65536 00:13:32.281 }, 00:13:32.281 { 00:13:32.281 "name": "BaseBdev3", 00:13:32.281 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:32.281 "is_configured": true, 00:13:32.281 "data_offset": 0, 00:13:32.281 "data_size": 65536 00:13:32.281 } 00:13:32.281 ] 00:13:32.281 } 00:13:32.281 } 00:13:32.281 }' 00:13:32.281 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:32.281 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:32.281 BaseBdev2 00:13:32.281 BaseBdev3' 00:13:32.281 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.281 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:32.281 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.539 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.539 "name": "NewBaseBdev", 00:13:32.539 "aliases": [ 00:13:32.539 "49d11b7f-fa79-489a-977b-3c076a6d8deb" 00:13:32.539 ], 00:13:32.539 "product_name": "Malloc disk", 00:13:32.539 "block_size": 512, 00:13:32.539 "num_blocks": 65536, 00:13:32.539 "uuid": "49d11b7f-fa79-489a-977b-3c076a6d8deb", 00:13:32.539 "assigned_rate_limits": { 00:13:32.539 "rw_ios_per_sec": 0, 00:13:32.539 "rw_mbytes_per_sec": 0, 00:13:32.539 "r_mbytes_per_sec": 0, 00:13:32.539 "w_mbytes_per_sec": 0 00:13:32.539 }, 00:13:32.539 "claimed": true, 00:13:32.539 "claim_type": "exclusive_write", 00:13:32.539 "zoned": false, 00:13:32.539 "supported_io_types": { 00:13:32.539 "read": true, 00:13:32.539 "write": true, 00:13:32.539 "unmap": true, 00:13:32.539 "flush": true, 00:13:32.539 "reset": true, 00:13:32.539 "nvme_admin": false, 00:13:32.539 "nvme_io": false, 00:13:32.539 "nvme_io_md": false, 00:13:32.539 "write_zeroes": true, 00:13:32.539 "zcopy": true, 00:13:32.539 "get_zone_info": false, 00:13:32.539 "zone_management": false, 00:13:32.539 "zone_append": false, 00:13:32.539 "compare": false, 00:13:32.539 "compare_and_write": false, 00:13:32.539 "abort": true, 00:13:32.539 "seek_hole": false, 00:13:32.539 "seek_data": false, 00:13:32.539 "copy": true, 00:13:32.539 "nvme_iov_md": false 00:13:32.539 }, 00:13:32.539 "memory_domains": [ 00:13:32.539 { 00:13:32.539 "dma_device_id": "system", 00:13:32.539 "dma_device_type": 1 00:13:32.539 }, 00:13:32.539 { 00:13:32.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.539 "dma_device_type": 2 00:13:32.539 } 00:13:32.539 ], 00:13:32.539 "driver_specific": {} 00:13:32.539 }' 00:13:32.539 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.539 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.803 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.803 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.803 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.804 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null 
== null ]] 00:13:32.804 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.804 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.804 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.804 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.804 10:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.061 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:33.061 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:33.061 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:33.061 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:33.319 "name": "BaseBdev2", 00:13:33.319 "aliases": [ 00:13:33.319 "97e1b141-067f-4e47-a3a3-3e087bb5f759" 00:13:33.319 ], 00:13:33.319 "product_name": "Malloc disk", 00:13:33.319 "block_size": 512, 00:13:33.319 "num_blocks": 65536, 00:13:33.319 "uuid": "97e1b141-067f-4e47-a3a3-3e087bb5f759", 00:13:33.319 "assigned_rate_limits": { 00:13:33.319 "rw_ios_per_sec": 0, 00:13:33.319 "rw_mbytes_per_sec": 0, 00:13:33.319 "r_mbytes_per_sec": 0, 00:13:33.319 "w_mbytes_per_sec": 0 00:13:33.319 }, 00:13:33.319 "claimed": true, 00:13:33.319 "claim_type": "exclusive_write", 00:13:33.319 "zoned": false, 00:13:33.319 "supported_io_types": { 00:13:33.319 "read": true, 00:13:33.319 "write": true, 00:13:33.319 "unmap": true, 00:13:33.319 "flush": true, 00:13:33.319 "reset": true, 00:13:33.319 "nvme_admin": false, 00:13:33.319 "nvme_io": false, 00:13:33.319 "nvme_io_md": false, 00:13:33.319 "write_zeroes": true, 00:13:33.319 "zcopy": true, 00:13:33.319 "get_zone_info": false, 00:13:33.319 "zone_management": false, 00:13:33.319 "zone_append": false, 00:13:33.319 "compare": false, 00:13:33.319 "compare_and_write": false, 00:13:33.319 "abort": true, 00:13:33.319 "seek_hole": false, 00:13:33.319 "seek_data": false, 00:13:33.319 "copy": true, 00:13:33.319 "nvme_iov_md": false 00:13:33.319 }, 00:13:33.319 "memory_domains": [ 00:13:33.319 { 00:13:33.319 "dma_device_id": "system", 00:13:33.319 "dma_device_type": 1 00:13:33.319 }, 00:13:33.319 { 00:13:33.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.319 "dma_device_type": 2 00:13:33.319 } 00:13:33.319 ], 00:13:33.319 "driver_specific": {} 00:13:33.319 }' 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.319 10:40:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.578 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:33.578 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.578 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.578 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:33.578 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:33.578 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:33.578 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:33.836 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:33.836 "name": "BaseBdev3", 00:13:33.836 "aliases": [ 00:13:33.836 "1df8cc08-66f7-4c80-9dbb-e56e46080e22" 00:13:33.836 ], 00:13:33.836 "product_name": "Malloc disk", 00:13:33.836 "block_size": 512, 00:13:33.836 "num_blocks": 65536, 00:13:33.836 "uuid": "1df8cc08-66f7-4c80-9dbb-e56e46080e22", 00:13:33.836 "assigned_rate_limits": { 00:13:33.836 "rw_ios_per_sec": 0, 00:13:33.836 "rw_mbytes_per_sec": 0, 00:13:33.836 "r_mbytes_per_sec": 0, 00:13:33.836 "w_mbytes_per_sec": 0 00:13:33.836 }, 00:13:33.836 "claimed": true, 00:13:33.836 "claim_type": "exclusive_write", 00:13:33.836 "zoned": false, 00:13:33.836 "supported_io_types": { 00:13:33.836 "read": true, 00:13:33.836 "write": true, 00:13:33.836 "unmap": true, 00:13:33.836 "flush": true, 00:13:33.836 "reset": true, 00:13:33.836 "nvme_admin": false, 00:13:33.836 "nvme_io": false, 00:13:33.836 "nvme_io_md": false, 00:13:33.836 "write_zeroes": true, 00:13:33.836 "zcopy": true, 00:13:33.836 "get_zone_info": false, 00:13:33.836 "zone_management": false, 00:13:33.836 "zone_append": false, 00:13:33.836 "compare": false, 00:13:33.836 "compare_and_write": false, 00:13:33.836 "abort": true, 00:13:33.836 "seek_hole": false, 00:13:33.836 "seek_data": false, 00:13:33.836 "copy": true, 00:13:33.836 "nvme_iov_md": false 00:13:33.836 }, 00:13:33.836 "memory_domains": [ 00:13:33.836 { 00:13:33.836 "dma_device_id": "system", 00:13:33.836 "dma_device_type": 1 00:13:33.836 }, 00:13:33.836 { 00:13:33.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.836 "dma_device_type": 2 00:13:33.836 } 00:13:33.836 ], 00:13:33.836 "driver_specific": {} 00:13:33.836 }' 00:13:33.836 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.836 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.836 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:33.836 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.836 10:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:34.094 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:34.094 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:34.094 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:34.094 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:34.094 10:40:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:34.094 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:34.094 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:34.095 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:34.353 [2024-07-12 10:40:09.440038] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:34.353 [2024-07-12 10:40:09.440063] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:34.353 [2024-07-12 10:40:09.440111] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:34.353 [2024-07-12 10:40:09.440159] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:34.353 [2024-07-12 10:40:09.440170] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8e2450 name Existed_Raid, state offline 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2035696 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2035696 ']' 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2035696 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2035696 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2035696' 00:13:34.353 killing process with pid 2035696 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2035696 00:13:34.353 [2024-07-12 10:40:09.507972] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:34.353 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2035696 00:13:34.353 [2024-07-12 10:40:09.536664] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:34.612 10:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:34.612 00:13:34.612 real 0m28.248s 00:13:34.612 user 0m51.780s 00:13:34.612 sys 0m5.101s 00:13:34.612 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:34.612 10:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.612 ************************************ 00:13:34.612 END TEST raid_state_function_test 00:13:34.612 ************************************ 00:13:34.871 10:40:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:34.871 10:40:09 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:13:34.871 10:40:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:34.871 
10:40:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:34.871 10:40:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:34.871 ************************************ 00:13:34.871 START TEST raid_state_function_test_sb 00:13:34.871 ************************************ 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2040000 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process 
raid pid: 2040000' 00:13:34.871 Process raid pid: 2040000 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2040000 /var/tmp/spdk-raid.sock 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2040000 ']' 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:34.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:34.871 10:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:34.871 [2024-07-12 10:40:09.922099] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:13:34.871 [2024-07-12 10:40:09.922166] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:34.871 [2024-07-12 10:40:10.054698] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:35.129 [2024-07-12 10:40:10.158470] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:35.129 [2024-07-12 10:40:10.226617] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:35.129 [2024-07-12 10:40:10.226652] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:35.695 10:40:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:35.695 10:40:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:35.695 10:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:35.953 [2024-07-12 10:40:10.993784] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:35.953 [2024-07-12 10:40:10.993825] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:35.953 [2024-07-12 10:40:10.993836] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:35.953 [2024-07-12 10:40:10.993847] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:35.953 [2024-07-12 10:40:10.993856] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:35.953 [2024-07-12 10:40:10.993867] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 
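The _sb variant starting here differs from the run above in exactly one RPC argument: bdev_raid_create is passed -s, which asks the raid module to persist a superblock on the base bdevs, and the subsequent dumps report "superblock": true with data_offset 2048 instead of 0. The sketch below condenses how this second instance is wired up; the bdev_svc invocation, socket path and RPC arguments are copied from the command lines in the trace, while the socket-wait loop is only a crude stand-in for the harness's waitforlisten helper.

    # Sketch: dedicated RPC server plus a superblock-enabled raid0 (note the -s flag).
    app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    sock=/var/tmp/spdk-raid.sock
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s "$sock" "$@"; }

    "$app" -r "$sock" -i 0 -L bdev_raid &          # same flags as in the trace
    raid_pid=$!
    while [ ! -S "$sock" ]; do sleep 0.1; done     # crude wait for the UNIX socket

    # Only '-s' differs from the non-superblock run; the base bdevs may still be
    # absent, so the array again starts out in the "configuring" state.
    rpc bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .superblock'

    kill "$raid_pid"                               # the harness tears this down via killprocess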
00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.953 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:36.211 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.211 "name": "Existed_Raid", 00:13:36.211 "uuid": "6e924d6b-1176-4532-bd94-68d21bd63112", 00:13:36.211 "strip_size_kb": 64, 00:13:36.211 "state": "configuring", 00:13:36.211 "raid_level": "raid0", 00:13:36.211 "superblock": true, 00:13:36.211 "num_base_bdevs": 3, 00:13:36.211 "num_base_bdevs_discovered": 0, 00:13:36.211 "num_base_bdevs_operational": 3, 00:13:36.211 "base_bdevs_list": [ 00:13:36.211 { 00:13:36.211 "name": "BaseBdev1", 00:13:36.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.211 "is_configured": false, 00:13:36.211 "data_offset": 0, 00:13:36.211 "data_size": 0 00:13:36.211 }, 00:13:36.211 { 00:13:36.211 "name": "BaseBdev2", 00:13:36.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.211 "is_configured": false, 00:13:36.211 "data_offset": 0, 00:13:36.211 "data_size": 0 00:13:36.211 }, 00:13:36.211 { 00:13:36.211 "name": "BaseBdev3", 00:13:36.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.211 "is_configured": false, 00:13:36.211 "data_offset": 0, 00:13:36.211 "data_size": 0 00:13:36.211 } 00:13:36.211 ] 00:13:36.211 }' 00:13:36.211 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.211 10:40:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:36.777 10:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:37.035 [2024-07-12 10:40:12.036404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:37.035 [2024-07-12 10:40:12.036435] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206ca80 name Existed_Raid, state configuring 00:13:37.035 10:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:37.293 
[2024-07-12 10:40:12.281084] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:37.293 [2024-07-12 10:40:12.281114] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:37.293 [2024-07-12 10:40:12.281128] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:37.293 [2024-07-12 10:40:12.281140] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:37.293 [2024-07-12 10:40:12.281149] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:37.293 [2024-07-12 10:40:12.281160] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:37.293 10:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:37.550 [2024-07-12 10:40:12.535623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:37.550 BaseBdev1 00:13:37.550 10:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:37.550 10:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:37.551 10:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:37.551 10:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:37.551 10:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:37.551 10:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:37.551 10:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:37.810 10:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:38.083 [ 00:13:38.083 { 00:13:38.083 "name": "BaseBdev1", 00:13:38.083 "aliases": [ 00:13:38.083 "9d342da9-0888-44e1-b6a2-27d654552837" 00:13:38.083 ], 00:13:38.083 "product_name": "Malloc disk", 00:13:38.083 "block_size": 512, 00:13:38.083 "num_blocks": 65536, 00:13:38.083 "uuid": "9d342da9-0888-44e1-b6a2-27d654552837", 00:13:38.083 "assigned_rate_limits": { 00:13:38.083 "rw_ios_per_sec": 0, 00:13:38.083 "rw_mbytes_per_sec": 0, 00:13:38.083 "r_mbytes_per_sec": 0, 00:13:38.083 "w_mbytes_per_sec": 0 00:13:38.083 }, 00:13:38.083 "claimed": true, 00:13:38.083 "claim_type": "exclusive_write", 00:13:38.083 "zoned": false, 00:13:38.083 "supported_io_types": { 00:13:38.083 "read": true, 00:13:38.083 "write": true, 00:13:38.083 "unmap": true, 00:13:38.083 "flush": true, 00:13:38.083 "reset": true, 00:13:38.083 "nvme_admin": false, 00:13:38.083 "nvme_io": false, 00:13:38.083 "nvme_io_md": false, 00:13:38.083 "write_zeroes": true, 00:13:38.083 "zcopy": true, 00:13:38.083 "get_zone_info": false, 00:13:38.083 "zone_management": false, 00:13:38.083 "zone_append": false, 00:13:38.083 "compare": false, 00:13:38.083 "compare_and_write": false, 00:13:38.083 "abort": true, 00:13:38.083 "seek_hole": false, 00:13:38.083 "seek_data": false, 00:13:38.083 "copy": true, 00:13:38.083 
"nvme_iov_md": false 00:13:38.083 }, 00:13:38.083 "memory_domains": [ 00:13:38.083 { 00:13:38.083 "dma_device_id": "system", 00:13:38.083 "dma_device_type": 1 00:13:38.083 }, 00:13:38.083 { 00:13:38.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.083 "dma_device_type": 2 00:13:38.083 } 00:13:38.083 ], 00:13:38.083 "driver_specific": {} 00:13:38.083 } 00:13:38.083 ] 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.083 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:38.356 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.356 "name": "Existed_Raid", 00:13:38.356 "uuid": "a4eb861b-ef94-49a8-9488-ad70daf1745c", 00:13:38.356 "strip_size_kb": 64, 00:13:38.356 "state": "configuring", 00:13:38.356 "raid_level": "raid0", 00:13:38.356 "superblock": true, 00:13:38.356 "num_base_bdevs": 3, 00:13:38.356 "num_base_bdevs_discovered": 1, 00:13:38.356 "num_base_bdevs_operational": 3, 00:13:38.356 "base_bdevs_list": [ 00:13:38.356 { 00:13:38.356 "name": "BaseBdev1", 00:13:38.356 "uuid": "9d342da9-0888-44e1-b6a2-27d654552837", 00:13:38.356 "is_configured": true, 00:13:38.356 "data_offset": 2048, 00:13:38.356 "data_size": 63488 00:13:38.356 }, 00:13:38.356 { 00:13:38.356 "name": "BaseBdev2", 00:13:38.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.356 "is_configured": false, 00:13:38.356 "data_offset": 0, 00:13:38.356 "data_size": 0 00:13:38.356 }, 00:13:38.356 { 00:13:38.356 "name": "BaseBdev3", 00:13:38.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.356 "is_configured": false, 00:13:38.356 "data_offset": 0, 00:13:38.356 "data_size": 0 00:13:38.356 } 00:13:38.356 ] 00:13:38.356 }' 00:13:38.356 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.356 10:40:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:38.920 10:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:39.178 [2024-07-12 10:40:14.156121] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:39.178 [2024-07-12 10:40:14.156161] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206c310 name Existed_Raid, state configuring 00:13:39.178 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:39.436 [2024-07-12 10:40:14.396800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:39.436 [2024-07-12 10:40:14.398284] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:39.436 [2024-07-12 10:40:14.398315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:39.436 [2024-07-12 10:40:14.398325] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:39.436 [2024-07-12 10:40:14.398336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.436 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:39.693 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.693 "name": "Existed_Raid", 00:13:39.693 "uuid": "98a8ee7d-f3ef-4de5-ac00-42bbaf1838eb", 00:13:39.693 "strip_size_kb": 64, 00:13:39.693 "state": "configuring", 00:13:39.693 "raid_level": "raid0", 00:13:39.693 "superblock": true, 00:13:39.693 "num_base_bdevs": 3, 00:13:39.693 "num_base_bdevs_discovered": 1, 00:13:39.693 "num_base_bdevs_operational": 3, 00:13:39.693 "base_bdevs_list": [ 
00:13:39.693 { 00:13:39.693 "name": "BaseBdev1", 00:13:39.693 "uuid": "9d342da9-0888-44e1-b6a2-27d654552837", 00:13:39.693 "is_configured": true, 00:13:39.693 "data_offset": 2048, 00:13:39.693 "data_size": 63488 00:13:39.693 }, 00:13:39.693 { 00:13:39.693 "name": "BaseBdev2", 00:13:39.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.693 "is_configured": false, 00:13:39.693 "data_offset": 0, 00:13:39.693 "data_size": 0 00:13:39.693 }, 00:13:39.693 { 00:13:39.693 "name": "BaseBdev3", 00:13:39.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.693 "is_configured": false, 00:13:39.693 "data_offset": 0, 00:13:39.693 "data_size": 0 00:13:39.693 } 00:13:39.693 ] 00:13:39.693 }' 00:13:39.694 10:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.694 10:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:40.258 10:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:40.516 [2024-07-12 10:40:15.511088] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:40.516 BaseBdev2 00:13:40.516 10:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:40.516 10:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:40.516 10:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:40.516 10:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:40.516 10:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:40.516 10:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:40.516 10:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:40.774 10:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:41.032 [ 00:13:41.032 { 00:13:41.032 "name": "BaseBdev2", 00:13:41.032 "aliases": [ 00:13:41.032 "bd76ae97-4727-46e8-9739-a3d33b01fe8d" 00:13:41.032 ], 00:13:41.032 "product_name": "Malloc disk", 00:13:41.032 "block_size": 512, 00:13:41.032 "num_blocks": 65536, 00:13:41.032 "uuid": "bd76ae97-4727-46e8-9739-a3d33b01fe8d", 00:13:41.032 "assigned_rate_limits": { 00:13:41.032 "rw_ios_per_sec": 0, 00:13:41.032 "rw_mbytes_per_sec": 0, 00:13:41.032 "r_mbytes_per_sec": 0, 00:13:41.032 "w_mbytes_per_sec": 0 00:13:41.032 }, 00:13:41.032 "claimed": true, 00:13:41.032 "claim_type": "exclusive_write", 00:13:41.032 "zoned": false, 00:13:41.032 "supported_io_types": { 00:13:41.032 "read": true, 00:13:41.032 "write": true, 00:13:41.032 "unmap": true, 00:13:41.032 "flush": true, 00:13:41.032 "reset": true, 00:13:41.032 "nvme_admin": false, 00:13:41.032 "nvme_io": false, 00:13:41.032 "nvme_io_md": false, 00:13:41.032 "write_zeroes": true, 00:13:41.032 "zcopy": true, 00:13:41.032 "get_zone_info": false, 00:13:41.032 "zone_management": false, 00:13:41.032 "zone_append": false, 00:13:41.032 "compare": false, 00:13:41.032 "compare_and_write": false, 
00:13:41.032 "abort": true, 00:13:41.032 "seek_hole": false, 00:13:41.032 "seek_data": false, 00:13:41.032 "copy": true, 00:13:41.032 "nvme_iov_md": false 00:13:41.032 }, 00:13:41.032 "memory_domains": [ 00:13:41.032 { 00:13:41.032 "dma_device_id": "system", 00:13:41.032 "dma_device_type": 1 00:13:41.032 }, 00:13:41.032 { 00:13:41.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.032 "dma_device_type": 2 00:13:41.032 } 00:13:41.032 ], 00:13:41.032 "driver_specific": {} 00:13:41.032 } 00:13:41.032 ] 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.032 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:41.290 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.290 "name": "Existed_Raid", 00:13:41.290 "uuid": "98a8ee7d-f3ef-4de5-ac00-42bbaf1838eb", 00:13:41.290 "strip_size_kb": 64, 00:13:41.290 "state": "configuring", 00:13:41.290 "raid_level": "raid0", 00:13:41.290 "superblock": true, 00:13:41.290 "num_base_bdevs": 3, 00:13:41.290 "num_base_bdevs_discovered": 2, 00:13:41.290 "num_base_bdevs_operational": 3, 00:13:41.290 "base_bdevs_list": [ 00:13:41.290 { 00:13:41.290 "name": "BaseBdev1", 00:13:41.290 "uuid": "9d342da9-0888-44e1-b6a2-27d654552837", 00:13:41.290 "is_configured": true, 00:13:41.290 "data_offset": 2048, 00:13:41.290 "data_size": 63488 00:13:41.290 }, 00:13:41.290 { 00:13:41.290 "name": "BaseBdev2", 00:13:41.290 "uuid": "bd76ae97-4727-46e8-9739-a3d33b01fe8d", 00:13:41.290 "is_configured": true, 00:13:41.290 "data_offset": 2048, 00:13:41.290 "data_size": 63488 00:13:41.290 }, 00:13:41.290 { 00:13:41.290 "name": "BaseBdev3", 00:13:41.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.290 "is_configured": false, 00:13:41.290 "data_offset": 0, 00:13:41.290 "data_size": 0 00:13:41.290 } 00:13:41.290 
] 00:13:41.290 }' 00:13:41.290 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.290 10:40:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:41.856 10:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:41.856 [2024-07-12 10:40:17.026530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:41.856 [2024-07-12 10:40:17.026688] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x206d400 00:13:41.856 [2024-07-12 10:40:17.026702] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:41.856 [2024-07-12 10:40:17.026876] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x206cef0 00:13:41.856 [2024-07-12 10:40:17.026990] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x206d400 00:13:41.856 [2024-07-12 10:40:17.026999] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x206d400 00:13:41.856 [2024-07-12 10:40:17.027090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:41.856 BaseBdev3 00:13:41.856 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:41.856 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:41.856 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:41.856 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:41.856 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:41.856 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:41.856 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:42.114 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:42.372 [ 00:13:42.372 { 00:13:42.372 "name": "BaseBdev3", 00:13:42.372 "aliases": [ 00:13:42.372 "d3961b8e-1cdf-483f-9cfa-2147946b81e9" 00:13:42.372 ], 00:13:42.372 "product_name": "Malloc disk", 00:13:42.372 "block_size": 512, 00:13:42.372 "num_blocks": 65536, 00:13:42.372 "uuid": "d3961b8e-1cdf-483f-9cfa-2147946b81e9", 00:13:42.372 "assigned_rate_limits": { 00:13:42.372 "rw_ios_per_sec": 0, 00:13:42.372 "rw_mbytes_per_sec": 0, 00:13:42.372 "r_mbytes_per_sec": 0, 00:13:42.372 "w_mbytes_per_sec": 0 00:13:42.372 }, 00:13:42.372 "claimed": true, 00:13:42.372 "claim_type": "exclusive_write", 00:13:42.372 "zoned": false, 00:13:42.372 "supported_io_types": { 00:13:42.372 "read": true, 00:13:42.372 "write": true, 00:13:42.372 "unmap": true, 00:13:42.372 "flush": true, 00:13:42.372 "reset": true, 00:13:42.372 "nvme_admin": false, 00:13:42.372 "nvme_io": false, 00:13:42.372 "nvme_io_md": false, 00:13:42.372 "write_zeroes": true, 00:13:42.372 "zcopy": true, 00:13:42.372 "get_zone_info": false, 00:13:42.372 "zone_management": false, 00:13:42.372 "zone_append": false, 
00:13:42.372 "compare": false, 00:13:42.372 "compare_and_write": false, 00:13:42.372 "abort": true, 00:13:42.372 "seek_hole": false, 00:13:42.372 "seek_data": false, 00:13:42.372 "copy": true, 00:13:42.372 "nvme_iov_md": false 00:13:42.372 }, 00:13:42.373 "memory_domains": [ 00:13:42.373 { 00:13:42.373 "dma_device_id": "system", 00:13:42.373 "dma_device_type": 1 00:13:42.373 }, 00:13:42.373 { 00:13:42.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.373 "dma_device_type": 2 00:13:42.373 } 00:13:42.373 ], 00:13:42.373 "driver_specific": {} 00:13:42.373 } 00:13:42.373 ] 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.373 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.631 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.631 "name": "Existed_Raid", 00:13:42.631 "uuid": "98a8ee7d-f3ef-4de5-ac00-42bbaf1838eb", 00:13:42.631 "strip_size_kb": 64, 00:13:42.631 "state": "online", 00:13:42.631 "raid_level": "raid0", 00:13:42.631 "superblock": true, 00:13:42.631 "num_base_bdevs": 3, 00:13:42.631 "num_base_bdevs_discovered": 3, 00:13:42.631 "num_base_bdevs_operational": 3, 00:13:42.631 "base_bdevs_list": [ 00:13:42.631 { 00:13:42.631 "name": "BaseBdev1", 00:13:42.631 "uuid": "9d342da9-0888-44e1-b6a2-27d654552837", 00:13:42.631 "is_configured": true, 00:13:42.631 "data_offset": 2048, 00:13:42.631 "data_size": 63488 00:13:42.631 }, 00:13:42.631 { 00:13:42.631 "name": "BaseBdev2", 00:13:42.631 "uuid": "bd76ae97-4727-46e8-9739-a3d33b01fe8d", 00:13:42.631 "is_configured": true, 00:13:42.631 "data_offset": 2048, 00:13:42.631 "data_size": 63488 00:13:42.631 }, 00:13:42.631 { 00:13:42.631 "name": "BaseBdev3", 00:13:42.631 "uuid": "d3961b8e-1cdf-483f-9cfa-2147946b81e9", 00:13:42.631 "is_configured": true, 00:13:42.631 "data_offset": 
2048, 00:13:42.631 "data_size": 63488 00:13:42.631 } 00:13:42.631 ] 00:13:42.631 }' 00:13:42.631 10:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.631 10:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:43.225 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:43.483 [2024-07-12 10:40:18.578950] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:43.483 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:43.483 "name": "Existed_Raid", 00:13:43.483 "aliases": [ 00:13:43.484 "98a8ee7d-f3ef-4de5-ac00-42bbaf1838eb" 00:13:43.484 ], 00:13:43.484 "product_name": "Raid Volume", 00:13:43.484 "block_size": 512, 00:13:43.484 "num_blocks": 190464, 00:13:43.484 "uuid": "98a8ee7d-f3ef-4de5-ac00-42bbaf1838eb", 00:13:43.484 "assigned_rate_limits": { 00:13:43.484 "rw_ios_per_sec": 0, 00:13:43.484 "rw_mbytes_per_sec": 0, 00:13:43.484 "r_mbytes_per_sec": 0, 00:13:43.484 "w_mbytes_per_sec": 0 00:13:43.484 }, 00:13:43.484 "claimed": false, 00:13:43.484 "zoned": false, 00:13:43.484 "supported_io_types": { 00:13:43.484 "read": true, 00:13:43.484 "write": true, 00:13:43.484 "unmap": true, 00:13:43.484 "flush": true, 00:13:43.484 "reset": true, 00:13:43.484 "nvme_admin": false, 00:13:43.484 "nvme_io": false, 00:13:43.484 "nvme_io_md": false, 00:13:43.484 "write_zeroes": true, 00:13:43.484 "zcopy": false, 00:13:43.484 "get_zone_info": false, 00:13:43.484 "zone_management": false, 00:13:43.484 "zone_append": false, 00:13:43.484 "compare": false, 00:13:43.484 "compare_and_write": false, 00:13:43.484 "abort": false, 00:13:43.484 "seek_hole": false, 00:13:43.484 "seek_data": false, 00:13:43.484 "copy": false, 00:13:43.484 "nvme_iov_md": false 00:13:43.484 }, 00:13:43.484 "memory_domains": [ 00:13:43.484 { 00:13:43.484 "dma_device_id": "system", 00:13:43.484 "dma_device_type": 1 00:13:43.484 }, 00:13:43.484 { 00:13:43.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.484 "dma_device_type": 2 00:13:43.484 }, 00:13:43.484 { 00:13:43.484 "dma_device_id": "system", 00:13:43.484 "dma_device_type": 1 00:13:43.484 }, 00:13:43.484 { 00:13:43.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.484 "dma_device_type": 2 00:13:43.484 }, 00:13:43.484 { 00:13:43.484 "dma_device_id": "system", 00:13:43.484 "dma_device_type": 1 00:13:43.484 }, 00:13:43.484 { 00:13:43.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.484 "dma_device_type": 2 00:13:43.484 } 00:13:43.484 ], 00:13:43.484 "driver_specific": { 00:13:43.484 "raid": 
{ 00:13:43.484 "uuid": "98a8ee7d-f3ef-4de5-ac00-42bbaf1838eb", 00:13:43.484 "strip_size_kb": 64, 00:13:43.484 "state": "online", 00:13:43.484 "raid_level": "raid0", 00:13:43.484 "superblock": true, 00:13:43.484 "num_base_bdevs": 3, 00:13:43.484 "num_base_bdevs_discovered": 3, 00:13:43.484 "num_base_bdevs_operational": 3, 00:13:43.484 "base_bdevs_list": [ 00:13:43.484 { 00:13:43.484 "name": "BaseBdev1", 00:13:43.484 "uuid": "9d342da9-0888-44e1-b6a2-27d654552837", 00:13:43.484 "is_configured": true, 00:13:43.484 "data_offset": 2048, 00:13:43.484 "data_size": 63488 00:13:43.484 }, 00:13:43.484 { 00:13:43.484 "name": "BaseBdev2", 00:13:43.484 "uuid": "bd76ae97-4727-46e8-9739-a3d33b01fe8d", 00:13:43.484 "is_configured": true, 00:13:43.484 "data_offset": 2048, 00:13:43.484 "data_size": 63488 00:13:43.484 }, 00:13:43.484 { 00:13:43.484 "name": "BaseBdev3", 00:13:43.484 "uuid": "d3961b8e-1cdf-483f-9cfa-2147946b81e9", 00:13:43.484 "is_configured": true, 00:13:43.484 "data_offset": 2048, 00:13:43.484 "data_size": 63488 00:13:43.484 } 00:13:43.484 ] 00:13:43.484 } 00:13:43.484 } 00:13:43.484 }' 00:13:43.484 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:43.484 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:43.484 BaseBdev2 00:13:43.484 BaseBdev3' 00:13:43.484 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:43.484 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:43.484 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:43.742 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:43.742 "name": "BaseBdev1", 00:13:43.742 "aliases": [ 00:13:43.742 "9d342da9-0888-44e1-b6a2-27d654552837" 00:13:43.742 ], 00:13:43.742 "product_name": "Malloc disk", 00:13:43.742 "block_size": 512, 00:13:43.742 "num_blocks": 65536, 00:13:43.742 "uuid": "9d342da9-0888-44e1-b6a2-27d654552837", 00:13:43.742 "assigned_rate_limits": { 00:13:43.742 "rw_ios_per_sec": 0, 00:13:43.742 "rw_mbytes_per_sec": 0, 00:13:43.742 "r_mbytes_per_sec": 0, 00:13:43.742 "w_mbytes_per_sec": 0 00:13:43.742 }, 00:13:43.742 "claimed": true, 00:13:43.742 "claim_type": "exclusive_write", 00:13:43.742 "zoned": false, 00:13:43.742 "supported_io_types": { 00:13:43.742 "read": true, 00:13:43.742 "write": true, 00:13:43.742 "unmap": true, 00:13:43.742 "flush": true, 00:13:43.742 "reset": true, 00:13:43.742 "nvme_admin": false, 00:13:43.742 "nvme_io": false, 00:13:43.742 "nvme_io_md": false, 00:13:43.742 "write_zeroes": true, 00:13:43.742 "zcopy": true, 00:13:43.742 "get_zone_info": false, 00:13:43.742 "zone_management": false, 00:13:43.742 "zone_append": false, 00:13:43.742 "compare": false, 00:13:43.742 "compare_and_write": false, 00:13:43.742 "abort": true, 00:13:43.742 "seek_hole": false, 00:13:43.742 "seek_data": false, 00:13:43.742 "copy": true, 00:13:43.742 "nvme_iov_md": false 00:13:43.742 }, 00:13:43.742 "memory_domains": [ 00:13:43.742 { 00:13:43.742 "dma_device_id": "system", 00:13:43.742 "dma_device_type": 1 00:13:43.742 }, 00:13:43.742 { 00:13:43.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.742 "dma_device_type": 2 00:13:43.742 } 00:13:43.742 ], 00:13:43.742 
"driver_specific": {} 00:13:43.742 }' 00:13:43.742 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.001 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.001 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:44.001 10:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.001 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.001 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:44.001 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.001 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.001 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:44.001 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.261 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.261 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:44.261 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:44.261 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:44.261 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:44.519 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:44.519 "name": "BaseBdev2", 00:13:44.519 "aliases": [ 00:13:44.519 "bd76ae97-4727-46e8-9739-a3d33b01fe8d" 00:13:44.519 ], 00:13:44.519 "product_name": "Malloc disk", 00:13:44.519 "block_size": 512, 00:13:44.519 "num_blocks": 65536, 00:13:44.519 "uuid": "bd76ae97-4727-46e8-9739-a3d33b01fe8d", 00:13:44.519 "assigned_rate_limits": { 00:13:44.519 "rw_ios_per_sec": 0, 00:13:44.519 "rw_mbytes_per_sec": 0, 00:13:44.519 "r_mbytes_per_sec": 0, 00:13:44.519 "w_mbytes_per_sec": 0 00:13:44.519 }, 00:13:44.519 "claimed": true, 00:13:44.519 "claim_type": "exclusive_write", 00:13:44.519 "zoned": false, 00:13:44.519 "supported_io_types": { 00:13:44.519 "read": true, 00:13:44.519 "write": true, 00:13:44.519 "unmap": true, 00:13:44.519 "flush": true, 00:13:44.519 "reset": true, 00:13:44.519 "nvme_admin": false, 00:13:44.519 "nvme_io": false, 00:13:44.519 "nvme_io_md": false, 00:13:44.519 "write_zeroes": true, 00:13:44.519 "zcopy": true, 00:13:44.519 "get_zone_info": false, 00:13:44.519 "zone_management": false, 00:13:44.519 "zone_append": false, 00:13:44.519 "compare": false, 00:13:44.519 "compare_and_write": false, 00:13:44.519 "abort": true, 00:13:44.519 "seek_hole": false, 00:13:44.519 "seek_data": false, 00:13:44.519 "copy": true, 00:13:44.519 "nvme_iov_md": false 00:13:44.519 }, 00:13:44.519 "memory_domains": [ 00:13:44.519 { 00:13:44.520 "dma_device_id": "system", 00:13:44.520 "dma_device_type": 1 00:13:44.520 }, 00:13:44.520 { 00:13:44.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:44.520 "dma_device_type": 2 00:13:44.520 } 00:13:44.520 ], 00:13:44.520 "driver_specific": {} 00:13:44.520 }' 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:44.520 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.779 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.779 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:44.779 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:44.779 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:44.779 10:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:45.038 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:45.038 "name": "BaseBdev3", 00:13:45.038 "aliases": [ 00:13:45.038 "d3961b8e-1cdf-483f-9cfa-2147946b81e9" 00:13:45.038 ], 00:13:45.038 "product_name": "Malloc disk", 00:13:45.038 "block_size": 512, 00:13:45.038 "num_blocks": 65536, 00:13:45.038 "uuid": "d3961b8e-1cdf-483f-9cfa-2147946b81e9", 00:13:45.038 "assigned_rate_limits": { 00:13:45.038 "rw_ios_per_sec": 0, 00:13:45.038 "rw_mbytes_per_sec": 0, 00:13:45.038 "r_mbytes_per_sec": 0, 00:13:45.038 "w_mbytes_per_sec": 0 00:13:45.038 }, 00:13:45.038 "claimed": true, 00:13:45.038 "claim_type": "exclusive_write", 00:13:45.038 "zoned": false, 00:13:45.038 "supported_io_types": { 00:13:45.038 "read": true, 00:13:45.038 "write": true, 00:13:45.038 "unmap": true, 00:13:45.038 "flush": true, 00:13:45.038 "reset": true, 00:13:45.038 "nvme_admin": false, 00:13:45.038 "nvme_io": false, 00:13:45.038 "nvme_io_md": false, 00:13:45.038 "write_zeroes": true, 00:13:45.038 "zcopy": true, 00:13:45.038 "get_zone_info": false, 00:13:45.038 "zone_management": false, 00:13:45.038 "zone_append": false, 00:13:45.038 "compare": false, 00:13:45.038 "compare_and_write": false, 00:13:45.038 "abort": true, 00:13:45.038 "seek_hole": false, 00:13:45.038 "seek_data": false, 00:13:45.038 "copy": true, 00:13:45.038 "nvme_iov_md": false 00:13:45.038 }, 00:13:45.038 "memory_domains": [ 00:13:45.038 { 00:13:45.038 "dma_device_id": "system", 00:13:45.038 "dma_device_type": 1 00:13:45.038 }, 00:13:45.038 { 00:13:45.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.038 "dma_device_type": 2 00:13:45.038 } 00:13:45.038 ], 00:13:45.038 "driver_specific": {} 00:13:45.038 }' 00:13:45.038 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:45.038 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:45.038 
10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:45.038 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:45.038 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:45.038 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:45.038 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:45.297 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:45.297 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:45.297 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:45.297 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:45.297 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:45.297 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:45.556 [2024-07-12 10:40:20.564058] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:45.556 [2024-07-12 10:40:20.564086] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:45.556 [2024-07-12 10:40:20.564126] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.556 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.556 10:40:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.815 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.815 "name": "Existed_Raid", 00:13:45.815 "uuid": "98a8ee7d-f3ef-4de5-ac00-42bbaf1838eb", 00:13:45.815 "strip_size_kb": 64, 00:13:45.815 "state": "offline", 00:13:45.815 "raid_level": "raid0", 00:13:45.815 "superblock": true, 00:13:45.815 "num_base_bdevs": 3, 00:13:45.815 "num_base_bdevs_discovered": 2, 00:13:45.815 "num_base_bdevs_operational": 2, 00:13:45.815 "base_bdevs_list": [ 00:13:45.815 { 00:13:45.815 "name": null, 00:13:45.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.815 "is_configured": false, 00:13:45.815 "data_offset": 2048, 00:13:45.815 "data_size": 63488 00:13:45.815 }, 00:13:45.815 { 00:13:45.815 "name": "BaseBdev2", 00:13:45.815 "uuid": "bd76ae97-4727-46e8-9739-a3d33b01fe8d", 00:13:45.815 "is_configured": true, 00:13:45.815 "data_offset": 2048, 00:13:45.815 "data_size": 63488 00:13:45.815 }, 00:13:45.815 { 00:13:45.815 "name": "BaseBdev3", 00:13:45.815 "uuid": "d3961b8e-1cdf-483f-9cfa-2147946b81e9", 00:13:45.815 "is_configured": true, 00:13:45.815 "data_offset": 2048, 00:13:45.815 "data_size": 63488 00:13:45.815 } 00:13:45.815 ] 00:13:45.815 }' 00:13:45.815 10:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.815 10:40:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.384 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:46.384 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:46.384 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:46.384 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.643 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:46.643 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:46.643 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:46.643 [2024-07-12 10:40:21.824469] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:46.902 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:46.902 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:46.902 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.902 10:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:46.902 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:46.902 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:46.902 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:47.160 [2024-07-12 10:40:22.314022] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:47.160 [2024-07-12 10:40:22.314065] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206d400 name Existed_Raid, state offline 00:13:47.160 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:47.160 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:47.160 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.160 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:47.419 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:47.419 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:47.419 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:47.419 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:47.419 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:47.419 10:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:47.985 BaseBdev2 00:13:47.985 10:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:47.985 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:47.985 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:47.985 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:47.985 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:47.985 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:47.985 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:48.243 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:48.501 [ 00:13:48.501 { 00:13:48.501 "name": "BaseBdev2", 00:13:48.501 "aliases": [ 00:13:48.501 "435eefb9-598e-405b-8041-02dce4a43b7d" 00:13:48.501 ], 00:13:48.501 "product_name": "Malloc disk", 00:13:48.501 "block_size": 512, 00:13:48.501 "num_blocks": 65536, 00:13:48.501 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:48.501 "assigned_rate_limits": { 00:13:48.501 "rw_ios_per_sec": 0, 00:13:48.501 "rw_mbytes_per_sec": 0, 00:13:48.501 "r_mbytes_per_sec": 0, 00:13:48.501 "w_mbytes_per_sec": 0 00:13:48.501 }, 00:13:48.501 "claimed": false, 00:13:48.501 "zoned": false, 00:13:48.501 "supported_io_types": { 00:13:48.501 "read": true, 00:13:48.501 "write": true, 00:13:48.501 "unmap": true, 00:13:48.501 "flush": true, 00:13:48.501 "reset": true, 00:13:48.501 "nvme_admin": false, 00:13:48.501 
"nvme_io": false, 00:13:48.501 "nvme_io_md": false, 00:13:48.501 "write_zeroes": true, 00:13:48.501 "zcopy": true, 00:13:48.501 "get_zone_info": false, 00:13:48.501 "zone_management": false, 00:13:48.501 "zone_append": false, 00:13:48.501 "compare": false, 00:13:48.501 "compare_and_write": false, 00:13:48.501 "abort": true, 00:13:48.501 "seek_hole": false, 00:13:48.501 "seek_data": false, 00:13:48.501 "copy": true, 00:13:48.501 "nvme_iov_md": false 00:13:48.501 }, 00:13:48.501 "memory_domains": [ 00:13:48.501 { 00:13:48.501 "dma_device_id": "system", 00:13:48.501 "dma_device_type": 1 00:13:48.501 }, 00:13:48.501 { 00:13:48.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.501 "dma_device_type": 2 00:13:48.501 } 00:13:48.501 ], 00:13:48.501 "driver_specific": {} 00:13:48.501 } 00:13:48.501 ] 00:13:48.501 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:48.501 10:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:48.501 10:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:48.501 10:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:48.759 BaseBdev3 00:13:48.759 10:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:48.759 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:48.759 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:48.759 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:48.759 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:48.759 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:48.759 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.017 10:40:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:49.017 [ 00:13:49.017 { 00:13:49.017 "name": "BaseBdev3", 00:13:49.017 "aliases": [ 00:13:49.017 "ffd2042f-f471-47b5-8756-c7f3e5c81528" 00:13:49.017 ], 00:13:49.017 "product_name": "Malloc disk", 00:13:49.017 "block_size": 512, 00:13:49.017 "num_blocks": 65536, 00:13:49.017 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:49.017 "assigned_rate_limits": { 00:13:49.017 "rw_ios_per_sec": 0, 00:13:49.017 "rw_mbytes_per_sec": 0, 00:13:49.017 "r_mbytes_per_sec": 0, 00:13:49.017 "w_mbytes_per_sec": 0 00:13:49.017 }, 00:13:49.017 "claimed": false, 00:13:49.017 "zoned": false, 00:13:49.017 "supported_io_types": { 00:13:49.017 "read": true, 00:13:49.017 "write": true, 00:13:49.017 "unmap": true, 00:13:49.017 "flush": true, 00:13:49.017 "reset": true, 00:13:49.017 "nvme_admin": false, 00:13:49.017 "nvme_io": false, 00:13:49.017 "nvme_io_md": false, 00:13:49.017 "write_zeroes": true, 00:13:49.017 "zcopy": true, 00:13:49.017 "get_zone_info": false, 00:13:49.017 "zone_management": false, 00:13:49.017 "zone_append": false, 00:13:49.017 "compare": 
false, 00:13:49.017 "compare_and_write": false, 00:13:49.017 "abort": true, 00:13:49.017 "seek_hole": false, 00:13:49.017 "seek_data": false, 00:13:49.017 "copy": true, 00:13:49.017 "nvme_iov_md": false 00:13:49.017 }, 00:13:49.017 "memory_domains": [ 00:13:49.017 { 00:13:49.017 "dma_device_id": "system", 00:13:49.017 "dma_device_type": 1 00:13:49.017 }, 00:13:49.017 { 00:13:49.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.017 "dma_device_type": 2 00:13:49.017 } 00:13:49.017 ], 00:13:49.017 "driver_specific": {} 00:13:49.017 } 00:13:49.017 ] 00:13:49.017 10:40:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:49.017 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:49.017 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:49.017 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:49.275 [2024-07-12 10:40:24.359559] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:49.275 [2024-07-12 10:40:24.359599] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:49.276 [2024-07-12 10:40:24.359619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:49.276 [2024-07-12 10:40:24.360993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.276 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.535 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.535 "name": "Existed_Raid", 00:13:49.535 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:13:49.535 "strip_size_kb": 64, 00:13:49.535 "state": "configuring", 00:13:49.535 "raid_level": "raid0", 00:13:49.535 "superblock": true, 00:13:49.535 
"num_base_bdevs": 3, 00:13:49.535 "num_base_bdevs_discovered": 2, 00:13:49.535 "num_base_bdevs_operational": 3, 00:13:49.535 "base_bdevs_list": [ 00:13:49.535 { 00:13:49.535 "name": "BaseBdev1", 00:13:49.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.535 "is_configured": false, 00:13:49.535 "data_offset": 0, 00:13:49.535 "data_size": 0 00:13:49.535 }, 00:13:49.535 { 00:13:49.535 "name": "BaseBdev2", 00:13:49.535 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:49.535 "is_configured": true, 00:13:49.535 "data_offset": 2048, 00:13:49.535 "data_size": 63488 00:13:49.535 }, 00:13:49.535 { 00:13:49.535 "name": "BaseBdev3", 00:13:49.535 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:49.535 "is_configured": true, 00:13:49.535 "data_offset": 2048, 00:13:49.535 "data_size": 63488 00:13:49.535 } 00:13:49.535 ] 00:13:49.535 }' 00:13:49.535 10:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.535 10:40:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:50.472 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:50.731 [2024-07-12 10:40:25.707105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.731 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.991 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.991 "name": "Existed_Raid", 00:13:50.991 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:13:50.991 "strip_size_kb": 64, 00:13:50.991 "state": "configuring", 00:13:50.991 "raid_level": "raid0", 00:13:50.991 "superblock": true, 00:13:50.991 "num_base_bdevs": 3, 00:13:50.991 "num_base_bdevs_discovered": 1, 00:13:50.991 "num_base_bdevs_operational": 3, 00:13:50.991 "base_bdevs_list": [ 00:13:50.991 { 00:13:50.991 "name": "BaseBdev1", 00:13:50.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.991 
"is_configured": false, 00:13:50.991 "data_offset": 0, 00:13:50.991 "data_size": 0 00:13:50.991 }, 00:13:50.991 { 00:13:50.991 "name": null, 00:13:50.991 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:50.991 "is_configured": false, 00:13:50.991 "data_offset": 2048, 00:13:50.991 "data_size": 63488 00:13:50.991 }, 00:13:50.991 { 00:13:50.991 "name": "BaseBdev3", 00:13:50.991 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:50.991 "is_configured": true, 00:13:50.991 "data_offset": 2048, 00:13:50.991 "data_size": 63488 00:13:50.991 } 00:13:50.991 ] 00:13:50.991 }' 00:13:50.991 10:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.991 10:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:51.927 10:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.928 10:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:52.185 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:52.185 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:52.466 [2024-07-12 10:40:27.508450] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:52.466 BaseBdev1 00:13:52.466 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:52.466 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:52.466 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:52.466 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:52.466 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:52.466 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:52.466 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:52.729 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:52.729 [ 00:13:52.729 { 00:13:52.729 "name": "BaseBdev1", 00:13:52.729 "aliases": [ 00:13:52.729 "6d6add87-b1c4-402c-993f-eb5e7422670d" 00:13:52.729 ], 00:13:52.729 "product_name": "Malloc disk", 00:13:52.729 "block_size": 512, 00:13:52.729 "num_blocks": 65536, 00:13:52.729 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:13:52.729 "assigned_rate_limits": { 00:13:52.729 "rw_ios_per_sec": 0, 00:13:52.729 "rw_mbytes_per_sec": 0, 00:13:52.730 "r_mbytes_per_sec": 0, 00:13:52.730 "w_mbytes_per_sec": 0 00:13:52.730 }, 00:13:52.730 "claimed": true, 00:13:52.730 "claim_type": "exclusive_write", 00:13:52.730 "zoned": false, 00:13:52.730 "supported_io_types": { 00:13:52.730 "read": true, 00:13:52.730 "write": true, 00:13:52.730 "unmap": true, 00:13:52.730 "flush": true, 00:13:52.730 "reset": true, 00:13:52.730 
"nvme_admin": false, 00:13:52.730 "nvme_io": false, 00:13:52.730 "nvme_io_md": false, 00:13:52.730 "write_zeroes": true, 00:13:52.730 "zcopy": true, 00:13:52.730 "get_zone_info": false, 00:13:52.730 "zone_management": false, 00:13:52.730 "zone_append": false, 00:13:52.730 "compare": false, 00:13:52.730 "compare_and_write": false, 00:13:52.730 "abort": true, 00:13:52.730 "seek_hole": false, 00:13:52.730 "seek_data": false, 00:13:52.730 "copy": true, 00:13:52.730 "nvme_iov_md": false 00:13:52.730 }, 00:13:52.730 "memory_domains": [ 00:13:52.730 { 00:13:52.730 "dma_device_id": "system", 00:13:52.730 "dma_device_type": 1 00:13:52.730 }, 00:13:52.730 { 00:13:52.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.730 "dma_device_type": 2 00:13:52.730 } 00:13:52.730 ], 00:13:52.730 "driver_specific": {} 00:13:52.730 } 00:13:52.730 ] 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.730 10:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.988 10:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.988 "name": "Existed_Raid", 00:13:52.988 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:13:52.988 "strip_size_kb": 64, 00:13:52.988 "state": "configuring", 00:13:52.988 "raid_level": "raid0", 00:13:52.988 "superblock": true, 00:13:52.988 "num_base_bdevs": 3, 00:13:52.988 "num_base_bdevs_discovered": 2, 00:13:52.988 "num_base_bdevs_operational": 3, 00:13:52.988 "base_bdevs_list": [ 00:13:52.988 { 00:13:52.988 "name": "BaseBdev1", 00:13:52.988 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:13:52.988 "is_configured": true, 00:13:52.988 "data_offset": 2048, 00:13:52.988 "data_size": 63488 00:13:52.988 }, 00:13:52.988 { 00:13:52.988 "name": null, 00:13:52.988 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:52.988 "is_configured": false, 00:13:52.988 "data_offset": 2048, 00:13:52.988 "data_size": 63488 00:13:52.988 }, 00:13:52.988 { 00:13:52.988 "name": "BaseBdev3", 00:13:52.988 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:52.988 
"is_configured": true, 00:13:52.988 "data_offset": 2048, 00:13:52.988 "data_size": 63488 00:13:52.988 } 00:13:52.988 ] 00:13:52.988 }' 00:13:52.988 10:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.988 10:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:53.921 10:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.921 10:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:53.921 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:53.921 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:54.488 [2024-07-12 10:40:29.489747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.488 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:54.746 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:54.746 "name": "Existed_Raid", 00:13:54.746 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:13:54.746 "strip_size_kb": 64, 00:13:54.746 "state": "configuring", 00:13:54.746 "raid_level": "raid0", 00:13:54.746 "superblock": true, 00:13:54.746 "num_base_bdevs": 3, 00:13:54.746 "num_base_bdevs_discovered": 1, 00:13:54.746 "num_base_bdevs_operational": 3, 00:13:54.746 "base_bdevs_list": [ 00:13:54.746 { 00:13:54.746 "name": "BaseBdev1", 00:13:54.746 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:13:54.746 "is_configured": true, 00:13:54.746 "data_offset": 2048, 00:13:54.746 "data_size": 63488 00:13:54.746 }, 00:13:54.746 { 00:13:54.746 "name": null, 00:13:54.746 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:54.746 "is_configured": false, 00:13:54.746 "data_offset": 2048, 
00:13:54.746 "data_size": 63488 00:13:54.746 }, 00:13:54.746 { 00:13:54.746 "name": null, 00:13:54.746 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:54.746 "is_configured": false, 00:13:54.746 "data_offset": 2048, 00:13:54.746 "data_size": 63488 00:13:54.746 } 00:13:54.746 ] 00:13:54.746 }' 00:13:54.746 10:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:54.746 10:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:55.311 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.311 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:55.569 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:55.569 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:55.827 [2024-07-12 10:40:30.853384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.827 10:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.086 10:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.086 "name": "Existed_Raid", 00:13:56.086 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:13:56.086 "strip_size_kb": 64, 00:13:56.086 "state": "configuring", 00:13:56.086 "raid_level": "raid0", 00:13:56.086 "superblock": true, 00:13:56.086 "num_base_bdevs": 3, 00:13:56.086 "num_base_bdevs_discovered": 2, 00:13:56.086 "num_base_bdevs_operational": 3, 00:13:56.086 "base_bdevs_list": [ 00:13:56.086 { 00:13:56.086 "name": "BaseBdev1", 00:13:56.086 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:13:56.086 "is_configured": true, 00:13:56.086 "data_offset": 2048, 00:13:56.086 "data_size": 
63488 00:13:56.086 }, 00:13:56.086 { 00:13:56.086 "name": null, 00:13:56.086 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:56.086 "is_configured": false, 00:13:56.086 "data_offset": 2048, 00:13:56.086 "data_size": 63488 00:13:56.086 }, 00:13:56.086 { 00:13:56.086 "name": "BaseBdev3", 00:13:56.086 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:56.086 "is_configured": true, 00:13:56.086 "data_offset": 2048, 00:13:56.086 "data_size": 63488 00:13:56.086 } 00:13:56.086 ] 00:13:56.086 }' 00:13:56.086 10:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.086 10:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:56.652 10:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.652 10:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:56.911 10:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:56.911 10:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:57.169 [2024-07-12 10:40:32.204980] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.169 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.427 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.427 "name": "Existed_Raid", 00:13:57.427 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:13:57.427 "strip_size_kb": 64, 00:13:57.427 "state": "configuring", 00:13:57.427 "raid_level": "raid0", 00:13:57.427 "superblock": true, 00:13:57.427 "num_base_bdevs": 3, 00:13:57.427 "num_base_bdevs_discovered": 1, 00:13:57.427 "num_base_bdevs_operational": 3, 00:13:57.427 "base_bdevs_list": [ 00:13:57.427 { 00:13:57.427 "name": null, 
00:13:57.427 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:13:57.427 "is_configured": false, 00:13:57.427 "data_offset": 2048, 00:13:57.427 "data_size": 63488 00:13:57.427 }, 00:13:57.427 { 00:13:57.427 "name": null, 00:13:57.427 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:57.427 "is_configured": false, 00:13:57.427 "data_offset": 2048, 00:13:57.427 "data_size": 63488 00:13:57.427 }, 00:13:57.427 { 00:13:57.427 "name": "BaseBdev3", 00:13:57.427 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:57.427 "is_configured": true, 00:13:57.427 "data_offset": 2048, 00:13:57.427 "data_size": 63488 00:13:57.427 } 00:13:57.427 ] 00:13:57.427 }' 00:13:57.427 10:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.427 10:40:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:58.004 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.004 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:58.266 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:58.266 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:58.524 [2024-07-12 10:40:33.564779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.524 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.782 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.782 "name": "Existed_Raid", 00:13:58.782 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:13:58.782 "strip_size_kb": 64, 00:13:58.782 "state": "configuring", 00:13:58.782 "raid_level": "raid0", 00:13:58.782 "superblock": true, 
00:13:58.782 "num_base_bdevs": 3, 00:13:58.782 "num_base_bdevs_discovered": 2, 00:13:58.782 "num_base_bdevs_operational": 3, 00:13:58.782 "base_bdevs_list": [ 00:13:58.782 { 00:13:58.782 "name": null, 00:13:58.782 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:13:58.782 "is_configured": false, 00:13:58.782 "data_offset": 2048, 00:13:58.782 "data_size": 63488 00:13:58.782 }, 00:13:58.782 { 00:13:58.782 "name": "BaseBdev2", 00:13:58.782 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:13:58.782 "is_configured": true, 00:13:58.782 "data_offset": 2048, 00:13:58.782 "data_size": 63488 00:13:58.782 }, 00:13:58.782 { 00:13:58.782 "name": "BaseBdev3", 00:13:58.782 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:13:58.782 "is_configured": true, 00:13:58.782 "data_offset": 2048, 00:13:58.782 "data_size": 63488 00:13:58.782 } 00:13:58.782 ] 00:13:58.782 }' 00:13:58.782 10:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.782 10:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.347 10:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.347 10:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:59.604 10:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:59.604 10:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.604 10:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:59.862 10:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6d6add87-b1c4-402c-993f-eb5e7422670d 00:14:00.120 [2024-07-12 10:40:35.113446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:00.120 [2024-07-12 10:40:35.113602] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x206be90 00:14:00.120 [2024-07-12 10:40:35.113616] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:00.120 [2024-07-12 10:40:35.113791] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d72940 00:14:00.120 [2024-07-12 10:40:35.113903] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x206be90 00:14:00.120 [2024-07-12 10:40:35.113913] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x206be90 00:14:00.120 [2024-07-12 10:40:35.114001] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:00.120 NewBaseBdev 00:14:00.120 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:00.120 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:00.120 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:00.120 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:00.120 10:40:35 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:00.120 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:00.120 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:00.377 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:00.635 [ 00:14:00.635 { 00:14:00.635 "name": "NewBaseBdev", 00:14:00.635 "aliases": [ 00:14:00.635 "6d6add87-b1c4-402c-993f-eb5e7422670d" 00:14:00.635 ], 00:14:00.635 "product_name": "Malloc disk", 00:14:00.635 "block_size": 512, 00:14:00.635 "num_blocks": 65536, 00:14:00.635 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:14:00.635 "assigned_rate_limits": { 00:14:00.635 "rw_ios_per_sec": 0, 00:14:00.635 "rw_mbytes_per_sec": 0, 00:14:00.635 "r_mbytes_per_sec": 0, 00:14:00.635 "w_mbytes_per_sec": 0 00:14:00.635 }, 00:14:00.635 "claimed": true, 00:14:00.635 "claim_type": "exclusive_write", 00:14:00.635 "zoned": false, 00:14:00.635 "supported_io_types": { 00:14:00.635 "read": true, 00:14:00.635 "write": true, 00:14:00.635 "unmap": true, 00:14:00.635 "flush": true, 00:14:00.635 "reset": true, 00:14:00.635 "nvme_admin": false, 00:14:00.635 "nvme_io": false, 00:14:00.635 "nvme_io_md": false, 00:14:00.635 "write_zeroes": true, 00:14:00.635 "zcopy": true, 00:14:00.635 "get_zone_info": false, 00:14:00.635 "zone_management": false, 00:14:00.635 "zone_append": false, 00:14:00.635 "compare": false, 00:14:00.635 "compare_and_write": false, 00:14:00.635 "abort": true, 00:14:00.635 "seek_hole": false, 00:14:00.635 "seek_data": false, 00:14:00.635 "copy": true, 00:14:00.635 "nvme_iov_md": false 00:14:00.635 }, 00:14:00.635 "memory_domains": [ 00:14:00.635 { 00:14:00.635 "dma_device_id": "system", 00:14:00.635 "dma_device_type": 1 00:14:00.635 }, 00:14:00.635 { 00:14:00.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.635 "dma_device_type": 2 00:14:00.635 } 00:14:00.635 ], 00:14:00.635 "driver_specific": {} 00:14:00.635 } 00:14:00.635 ] 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 
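The repeated verify_raid_bdev_state checks traced in this run all reduce to one pattern: dump the raid bdevs over the RPC socket, select the named entry with jq, and compare fields such as state, raid_level, num_base_bdevs_discovered and base_bdevs_list against the expected values. Below is a minimal bash sketch of that pattern, assuming only the rpc.py path, socket and bdev names visible in this log; the check_raid_state helper name is illustrative and is not the test script's own function.

#!/usr/bin/env bash
# Sketch of the state-check pattern seen in this trace; assumes jq is installed
# and an SPDK app is listening on the raid test RPC socket.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

check_raid_state() {
        local name=$1 expected=$2 state
        # bdev_raid_get_bdevs returns a JSON array; pick out the named bdev's state.
        state=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
                jq -r ".[] | select(.name == \"$name\") | .state")
        [[ "$state" == "$expected" ]]
}

# Example: Existed_Raid is expected to stay "configuring" while base bdevs are
# swapped in and out, and to report "online" once all three are configured.
check_raid_state Existed_Raid configuring && echo "Existed_Raid is configuring"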
00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.635 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.892 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.892 "name": "Existed_Raid", 00:14:00.892 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:14:00.892 "strip_size_kb": 64, 00:14:00.892 "state": "online", 00:14:00.892 "raid_level": "raid0", 00:14:00.892 "superblock": true, 00:14:00.892 "num_base_bdevs": 3, 00:14:00.892 "num_base_bdevs_discovered": 3, 00:14:00.892 "num_base_bdevs_operational": 3, 00:14:00.892 "base_bdevs_list": [ 00:14:00.892 { 00:14:00.892 "name": "NewBaseBdev", 00:14:00.892 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:14:00.892 "is_configured": true, 00:14:00.892 "data_offset": 2048, 00:14:00.892 "data_size": 63488 00:14:00.892 }, 00:14:00.892 { 00:14:00.892 "name": "BaseBdev2", 00:14:00.892 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:14:00.892 "is_configured": true, 00:14:00.892 "data_offset": 2048, 00:14:00.892 "data_size": 63488 00:14:00.892 }, 00:14:00.892 { 00:14:00.892 "name": "BaseBdev3", 00:14:00.892 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:14:00.892 "is_configured": true, 00:14:00.892 "data_offset": 2048, 00:14:00.892 "data_size": 63488 00:14:00.892 } 00:14:00.892 ] 00:14:00.892 }' 00:14:00.892 10:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.892 10:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:01.456 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:01.713 [2024-07-12 10:40:36.685917] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:01.713 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:01.713 "name": "Existed_Raid", 00:14:01.713 "aliases": [ 00:14:01.713 "32f26903-c261-4530-b27c-3570933fbc81" 00:14:01.713 ], 00:14:01.713 "product_name": "Raid Volume", 00:14:01.713 "block_size": 512, 00:14:01.713 "num_blocks": 190464, 00:14:01.713 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:14:01.713 "assigned_rate_limits": { 00:14:01.713 "rw_ios_per_sec": 0, 00:14:01.713 "rw_mbytes_per_sec": 0, 00:14:01.713 "r_mbytes_per_sec": 0, 00:14:01.713 "w_mbytes_per_sec": 0 00:14:01.713 }, 00:14:01.713 "claimed": false, 00:14:01.713 "zoned": false, 
00:14:01.713 "supported_io_types": { 00:14:01.713 "read": true, 00:14:01.713 "write": true, 00:14:01.713 "unmap": true, 00:14:01.713 "flush": true, 00:14:01.713 "reset": true, 00:14:01.713 "nvme_admin": false, 00:14:01.713 "nvme_io": false, 00:14:01.713 "nvme_io_md": false, 00:14:01.713 "write_zeroes": true, 00:14:01.713 "zcopy": false, 00:14:01.713 "get_zone_info": false, 00:14:01.713 "zone_management": false, 00:14:01.713 "zone_append": false, 00:14:01.713 "compare": false, 00:14:01.713 "compare_and_write": false, 00:14:01.713 "abort": false, 00:14:01.713 "seek_hole": false, 00:14:01.713 "seek_data": false, 00:14:01.713 "copy": false, 00:14:01.713 "nvme_iov_md": false 00:14:01.713 }, 00:14:01.713 "memory_domains": [ 00:14:01.713 { 00:14:01.713 "dma_device_id": "system", 00:14:01.713 "dma_device_type": 1 00:14:01.713 }, 00:14:01.713 { 00:14:01.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.713 "dma_device_type": 2 00:14:01.713 }, 00:14:01.713 { 00:14:01.713 "dma_device_id": "system", 00:14:01.713 "dma_device_type": 1 00:14:01.713 }, 00:14:01.713 { 00:14:01.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.713 "dma_device_type": 2 00:14:01.713 }, 00:14:01.714 { 00:14:01.714 "dma_device_id": "system", 00:14:01.714 "dma_device_type": 1 00:14:01.714 }, 00:14:01.714 { 00:14:01.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.714 "dma_device_type": 2 00:14:01.714 } 00:14:01.714 ], 00:14:01.714 "driver_specific": { 00:14:01.714 "raid": { 00:14:01.714 "uuid": "32f26903-c261-4530-b27c-3570933fbc81", 00:14:01.714 "strip_size_kb": 64, 00:14:01.714 "state": "online", 00:14:01.714 "raid_level": "raid0", 00:14:01.714 "superblock": true, 00:14:01.714 "num_base_bdevs": 3, 00:14:01.714 "num_base_bdevs_discovered": 3, 00:14:01.714 "num_base_bdevs_operational": 3, 00:14:01.714 "base_bdevs_list": [ 00:14:01.714 { 00:14:01.714 "name": "NewBaseBdev", 00:14:01.714 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:14:01.714 "is_configured": true, 00:14:01.714 "data_offset": 2048, 00:14:01.714 "data_size": 63488 00:14:01.714 }, 00:14:01.714 { 00:14:01.714 "name": "BaseBdev2", 00:14:01.714 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:14:01.714 "is_configured": true, 00:14:01.714 "data_offset": 2048, 00:14:01.714 "data_size": 63488 00:14:01.714 }, 00:14:01.714 { 00:14:01.714 "name": "BaseBdev3", 00:14:01.714 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:14:01.714 "is_configured": true, 00:14:01.714 "data_offset": 2048, 00:14:01.714 "data_size": 63488 00:14:01.714 } 00:14:01.714 ] 00:14:01.714 } 00:14:01.714 } 00:14:01.714 }' 00:14:01.714 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:01.714 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:01.714 BaseBdev2 00:14:01.714 BaseBdev3' 00:14:01.714 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:01.714 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:01.714 10:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:01.971 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:01.971 "name": "NewBaseBdev", 00:14:01.971 "aliases": [ 00:14:01.971 
"6d6add87-b1c4-402c-993f-eb5e7422670d" 00:14:01.971 ], 00:14:01.971 "product_name": "Malloc disk", 00:14:01.971 "block_size": 512, 00:14:01.971 "num_blocks": 65536, 00:14:01.971 "uuid": "6d6add87-b1c4-402c-993f-eb5e7422670d", 00:14:01.971 "assigned_rate_limits": { 00:14:01.971 "rw_ios_per_sec": 0, 00:14:01.971 "rw_mbytes_per_sec": 0, 00:14:01.971 "r_mbytes_per_sec": 0, 00:14:01.971 "w_mbytes_per_sec": 0 00:14:01.971 }, 00:14:01.971 "claimed": true, 00:14:01.971 "claim_type": "exclusive_write", 00:14:01.971 "zoned": false, 00:14:01.971 "supported_io_types": { 00:14:01.971 "read": true, 00:14:01.971 "write": true, 00:14:01.971 "unmap": true, 00:14:01.971 "flush": true, 00:14:01.971 "reset": true, 00:14:01.971 "nvme_admin": false, 00:14:01.971 "nvme_io": false, 00:14:01.971 "nvme_io_md": false, 00:14:01.971 "write_zeroes": true, 00:14:01.971 "zcopy": true, 00:14:01.971 "get_zone_info": false, 00:14:01.971 "zone_management": false, 00:14:01.971 "zone_append": false, 00:14:01.971 "compare": false, 00:14:01.971 "compare_and_write": false, 00:14:01.971 "abort": true, 00:14:01.971 "seek_hole": false, 00:14:01.971 "seek_data": false, 00:14:01.971 "copy": true, 00:14:01.971 "nvme_iov_md": false 00:14:01.971 }, 00:14:01.971 "memory_domains": [ 00:14:01.971 { 00:14:01.971 "dma_device_id": "system", 00:14:01.971 "dma_device_type": 1 00:14:01.971 }, 00:14:01.971 { 00:14:01.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.971 "dma_device_type": 2 00:14:01.971 } 00:14:01.971 ], 00:14:01.971 "driver_specific": {} 00:14:01.971 }' 00:14:01.971 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.971 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.971 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:01.971 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.971 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:02.228 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:02.486 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:02.486 "name": "BaseBdev2", 00:14:02.486 "aliases": [ 00:14:02.486 "435eefb9-598e-405b-8041-02dce4a43b7d" 00:14:02.486 ], 00:14:02.486 "product_name": "Malloc disk", 00:14:02.486 "block_size": 512, 
00:14:02.486 "num_blocks": 65536, 00:14:02.486 "uuid": "435eefb9-598e-405b-8041-02dce4a43b7d", 00:14:02.486 "assigned_rate_limits": { 00:14:02.486 "rw_ios_per_sec": 0, 00:14:02.486 "rw_mbytes_per_sec": 0, 00:14:02.486 "r_mbytes_per_sec": 0, 00:14:02.486 "w_mbytes_per_sec": 0 00:14:02.486 }, 00:14:02.486 "claimed": true, 00:14:02.486 "claim_type": "exclusive_write", 00:14:02.486 "zoned": false, 00:14:02.486 "supported_io_types": { 00:14:02.486 "read": true, 00:14:02.486 "write": true, 00:14:02.486 "unmap": true, 00:14:02.486 "flush": true, 00:14:02.486 "reset": true, 00:14:02.486 "nvme_admin": false, 00:14:02.486 "nvme_io": false, 00:14:02.486 "nvme_io_md": false, 00:14:02.486 "write_zeroes": true, 00:14:02.486 "zcopy": true, 00:14:02.486 "get_zone_info": false, 00:14:02.486 "zone_management": false, 00:14:02.486 "zone_append": false, 00:14:02.486 "compare": false, 00:14:02.486 "compare_and_write": false, 00:14:02.486 "abort": true, 00:14:02.486 "seek_hole": false, 00:14:02.486 "seek_data": false, 00:14:02.486 "copy": true, 00:14:02.486 "nvme_iov_md": false 00:14:02.486 }, 00:14:02.486 "memory_domains": [ 00:14:02.486 { 00:14:02.486 "dma_device_id": "system", 00:14:02.486 "dma_device_type": 1 00:14:02.486 }, 00:14:02.486 { 00:14:02.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.487 "dma_device_type": 2 00:14:02.487 } 00:14:02.487 ], 00:14:02.487 "driver_specific": {} 00:14:02.487 }' 00:14:02.487 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:02.487 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:02.744 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.003 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.003 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.003 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:03.003 10:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:03.003 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:03.003 "name": "BaseBdev3", 00:14:03.003 "aliases": [ 00:14:03.003 "ffd2042f-f471-47b5-8756-c7f3e5c81528" 00:14:03.003 ], 00:14:03.003 "product_name": "Malloc disk", 00:14:03.003 "block_size": 512, 00:14:03.003 "num_blocks": 65536, 00:14:03.003 "uuid": "ffd2042f-f471-47b5-8756-c7f3e5c81528", 00:14:03.003 "assigned_rate_limits": { 
00:14:03.003 "rw_ios_per_sec": 0, 00:14:03.003 "rw_mbytes_per_sec": 0, 00:14:03.003 "r_mbytes_per_sec": 0, 00:14:03.003 "w_mbytes_per_sec": 0 00:14:03.003 }, 00:14:03.003 "claimed": true, 00:14:03.003 "claim_type": "exclusive_write", 00:14:03.003 "zoned": false, 00:14:03.003 "supported_io_types": { 00:14:03.003 "read": true, 00:14:03.003 "write": true, 00:14:03.003 "unmap": true, 00:14:03.003 "flush": true, 00:14:03.003 "reset": true, 00:14:03.003 "nvme_admin": false, 00:14:03.003 "nvme_io": false, 00:14:03.003 "nvme_io_md": false, 00:14:03.003 "write_zeroes": true, 00:14:03.003 "zcopy": true, 00:14:03.003 "get_zone_info": false, 00:14:03.003 "zone_management": false, 00:14:03.003 "zone_append": false, 00:14:03.003 "compare": false, 00:14:03.003 "compare_and_write": false, 00:14:03.003 "abort": true, 00:14:03.003 "seek_hole": false, 00:14:03.003 "seek_data": false, 00:14:03.003 "copy": true, 00:14:03.003 "nvme_iov_md": false 00:14:03.003 }, 00:14:03.003 "memory_domains": [ 00:14:03.003 { 00:14:03.003 "dma_device_id": "system", 00:14:03.003 "dma_device_type": 1 00:14:03.003 }, 00:14:03.003 { 00:14:03.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.003 "dma_device_type": 2 00:14:03.003 } 00:14:03.003 ], 00:14:03.003 "driver_specific": {} 00:14:03.003 }' 00:14:03.003 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.003 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.260 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.519 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.519 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:03.785 [2024-07-12 10:40:38.719043] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:03.785 [2024-07-12 10:40:38.719068] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:03.785 [2024-07-12 10:40:38.719117] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:03.785 [2024-07-12 10:40:38.719168] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:03.785 [2024-07-12 10:40:38.719179] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206be90 name Existed_Raid, state offline 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2040000 00:14:03.785 
10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2040000 ']' 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2040000 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2040000 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2040000' 00:14:03.785 killing process with pid 2040000 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2040000 00:14:03.785 [2024-07-12 10:40:38.788754] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:03.785 10:40:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2040000 00:14:03.785 [2024-07-12 10:40:38.814897] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:04.044 10:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:04.044 00:14:04.044 real 0m29.171s 00:14:04.044 user 0m53.513s 00:14:04.044 sys 0m5.286s 00:14:04.044 10:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:04.044 10:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:04.044 ************************************ 00:14:04.045 END TEST raid_state_function_test_sb 00:14:04.045 ************************************ 00:14:04.045 10:40:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:04.045 10:40:39 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:04.045 10:40:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:04.045 10:40:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:04.045 10:40:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:04.045 ************************************ 00:14:04.045 START TEST raid_superblock_test 00:14:04.045 ************************************ 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 
00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2044369 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2044369 /var/tmp/spdk-raid.sock 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2044369 ']' 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:04.045 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:04.045 10:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:04.045 [2024-07-12 10:40:39.158228] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:14:04.045 [2024-07-12 10:40:39.158290] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044369 ] 00:14:04.302 [2024-07-12 10:40:39.286196] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:04.302 [2024-07-12 10:40:39.392460] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:04.302 [2024-07-12 10:40:39.456637] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:04.302 [2024-07-12 10:40:39.456669] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:05.235 malloc1 00:14:05.235 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:05.493 [2024-07-12 10:40:40.570416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:05.493 [2024-07-12 10:40:40.570465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:05.493 [2024-07-12 10:40:40.570493] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1962570 00:14:05.493 [2024-07-12 10:40:40.570506] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:05.493 [2024-07-12 10:40:40.572339] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:05.493 [2024-07-12 10:40:40.572368] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:05.493 pt1 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:05.493 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:05.750 malloc2 00:14:05.750 10:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:06.007 [2024-07-12 10:40:41.069768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:06.007 [2024-07-12 10:40:41.069813] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:06.007 [2024-07-12 10:40:41.069831] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1963970 00:14:06.007 [2024-07-12 10:40:41.069851] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:06.007 [2024-07-12 10:40:41.071499] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:06.007 [2024-07-12 10:40:41.071526] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:06.007 pt2 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:06.007 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:06.264 malloc3 00:14:06.264 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:06.524 [2024-07-12 10:40:41.564930] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:06.524 [2024-07-12 10:40:41.564975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:06.524 [2024-07-12 10:40:41.564994] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1afa340 00:14:06.524 [2024-07-12 10:40:41.565006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:06.524 [2024-07-12 10:40:41.566553] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:06.524 [2024-07-12 10:40:41.566580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:06.524 pt3 00:14:06.524 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:06.524 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:06.524 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:06.786 [2024-07-12 10:40:41.805603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:06.786 [2024-07-12 10:40:41.806953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:06.786 [2024-07-12 10:40:41.807007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:06.786 [2024-07-12 10:40:41.807156] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x195aea0 00:14:06.786 [2024-07-12 10:40:41.807167] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:06.786 [2024-07-12 10:40:41.807371] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1962240 00:14:06.786 [2024-07-12 10:40:41.807526] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x195aea0 00:14:06.786 [2024-07-12 10:40:41.807537] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x195aea0 00:14:06.786 [2024-07-12 10:40:41.807635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.786 10:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:07.042 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.042 "name": "raid_bdev1", 00:14:07.042 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:07.042 "strip_size_kb": 64, 00:14:07.042 "state": "online", 00:14:07.042 "raid_level": "raid0", 00:14:07.042 "superblock": true, 00:14:07.042 "num_base_bdevs": 3, 
00:14:07.042 "num_base_bdevs_discovered": 3, 00:14:07.042 "num_base_bdevs_operational": 3, 00:14:07.042 "base_bdevs_list": [ 00:14:07.042 { 00:14:07.042 "name": "pt1", 00:14:07.042 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:07.042 "is_configured": true, 00:14:07.042 "data_offset": 2048, 00:14:07.042 "data_size": 63488 00:14:07.042 }, 00:14:07.042 { 00:14:07.042 "name": "pt2", 00:14:07.042 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:07.042 "is_configured": true, 00:14:07.042 "data_offset": 2048, 00:14:07.042 "data_size": 63488 00:14:07.042 }, 00:14:07.042 { 00:14:07.042 "name": "pt3", 00:14:07.042 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:07.042 "is_configured": true, 00:14:07.042 "data_offset": 2048, 00:14:07.042 "data_size": 63488 00:14:07.042 } 00:14:07.042 ] 00:14:07.042 }' 00:14:07.042 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.042 10:40:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:07.606 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:07.864 [2024-07-12 10:40:42.892727] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:07.864 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:07.864 "name": "raid_bdev1", 00:14:07.864 "aliases": [ 00:14:07.864 "9cac51bb-4560-4fde-a640-37b78cb0e409" 00:14:07.864 ], 00:14:07.864 "product_name": "Raid Volume", 00:14:07.864 "block_size": 512, 00:14:07.864 "num_blocks": 190464, 00:14:07.864 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:07.864 "assigned_rate_limits": { 00:14:07.864 "rw_ios_per_sec": 0, 00:14:07.864 "rw_mbytes_per_sec": 0, 00:14:07.864 "r_mbytes_per_sec": 0, 00:14:07.864 "w_mbytes_per_sec": 0 00:14:07.864 }, 00:14:07.864 "claimed": false, 00:14:07.864 "zoned": false, 00:14:07.864 "supported_io_types": { 00:14:07.864 "read": true, 00:14:07.864 "write": true, 00:14:07.864 "unmap": true, 00:14:07.864 "flush": true, 00:14:07.864 "reset": true, 00:14:07.864 "nvme_admin": false, 00:14:07.864 "nvme_io": false, 00:14:07.864 "nvme_io_md": false, 00:14:07.864 "write_zeroes": true, 00:14:07.864 "zcopy": false, 00:14:07.864 "get_zone_info": false, 00:14:07.864 "zone_management": false, 00:14:07.864 "zone_append": false, 00:14:07.864 "compare": false, 00:14:07.864 "compare_and_write": false, 00:14:07.864 "abort": false, 00:14:07.864 "seek_hole": false, 00:14:07.864 "seek_data": false, 00:14:07.864 "copy": false, 00:14:07.864 "nvme_iov_md": false 00:14:07.864 }, 00:14:07.864 "memory_domains": [ 00:14:07.864 { 00:14:07.864 "dma_device_id": "system", 00:14:07.864 "dma_device_type": 1 
00:14:07.864 }, 00:14:07.864 { 00:14:07.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.864 "dma_device_type": 2 00:14:07.864 }, 00:14:07.864 { 00:14:07.864 "dma_device_id": "system", 00:14:07.864 "dma_device_type": 1 00:14:07.864 }, 00:14:07.864 { 00:14:07.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.864 "dma_device_type": 2 00:14:07.864 }, 00:14:07.864 { 00:14:07.864 "dma_device_id": "system", 00:14:07.864 "dma_device_type": 1 00:14:07.864 }, 00:14:07.864 { 00:14:07.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.864 "dma_device_type": 2 00:14:07.864 } 00:14:07.864 ], 00:14:07.864 "driver_specific": { 00:14:07.864 "raid": { 00:14:07.864 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:07.864 "strip_size_kb": 64, 00:14:07.864 "state": "online", 00:14:07.864 "raid_level": "raid0", 00:14:07.864 "superblock": true, 00:14:07.864 "num_base_bdevs": 3, 00:14:07.864 "num_base_bdevs_discovered": 3, 00:14:07.864 "num_base_bdevs_operational": 3, 00:14:07.864 "base_bdevs_list": [ 00:14:07.864 { 00:14:07.864 "name": "pt1", 00:14:07.864 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:07.864 "is_configured": true, 00:14:07.864 "data_offset": 2048, 00:14:07.864 "data_size": 63488 00:14:07.864 }, 00:14:07.864 { 00:14:07.864 "name": "pt2", 00:14:07.864 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:07.864 "is_configured": true, 00:14:07.864 "data_offset": 2048, 00:14:07.864 "data_size": 63488 00:14:07.864 }, 00:14:07.864 { 00:14:07.864 "name": "pt3", 00:14:07.864 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:07.864 "is_configured": true, 00:14:07.864 "data_offset": 2048, 00:14:07.864 "data_size": 63488 00:14:07.864 } 00:14:07.864 ] 00:14:07.864 } 00:14:07.864 } 00:14:07.864 }' 00:14:07.864 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:07.864 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:07.864 pt2 00:14:07.864 pt3' 00:14:07.864 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:07.864 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:07.864 10:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:08.122 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:08.122 "name": "pt1", 00:14:08.122 "aliases": [ 00:14:08.122 "00000000-0000-0000-0000-000000000001" 00:14:08.122 ], 00:14:08.122 "product_name": "passthru", 00:14:08.122 "block_size": 512, 00:14:08.122 "num_blocks": 65536, 00:14:08.122 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:08.122 "assigned_rate_limits": { 00:14:08.122 "rw_ios_per_sec": 0, 00:14:08.122 "rw_mbytes_per_sec": 0, 00:14:08.122 "r_mbytes_per_sec": 0, 00:14:08.122 "w_mbytes_per_sec": 0 00:14:08.122 }, 00:14:08.122 "claimed": true, 00:14:08.122 "claim_type": "exclusive_write", 00:14:08.122 "zoned": false, 00:14:08.122 "supported_io_types": { 00:14:08.122 "read": true, 00:14:08.122 "write": true, 00:14:08.122 "unmap": true, 00:14:08.122 "flush": true, 00:14:08.122 "reset": true, 00:14:08.122 "nvme_admin": false, 00:14:08.122 "nvme_io": false, 00:14:08.122 "nvme_io_md": false, 00:14:08.122 "write_zeroes": true, 00:14:08.122 "zcopy": true, 00:14:08.122 "get_zone_info": false, 00:14:08.122 "zone_management": false, 
00:14:08.122 "zone_append": false, 00:14:08.122 "compare": false, 00:14:08.122 "compare_and_write": false, 00:14:08.122 "abort": true, 00:14:08.122 "seek_hole": false, 00:14:08.122 "seek_data": false, 00:14:08.122 "copy": true, 00:14:08.122 "nvme_iov_md": false 00:14:08.122 }, 00:14:08.122 "memory_domains": [ 00:14:08.122 { 00:14:08.122 "dma_device_id": "system", 00:14:08.122 "dma_device_type": 1 00:14:08.122 }, 00:14:08.122 { 00:14:08.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.122 "dma_device_type": 2 00:14:08.122 } 00:14:08.122 ], 00:14:08.122 "driver_specific": { 00:14:08.122 "passthru": { 00:14:08.122 "name": "pt1", 00:14:08.122 "base_bdev_name": "malloc1" 00:14:08.122 } 00:14:08.122 } 00:14:08.122 }' 00:14:08.122 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.122 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.122 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:08.122 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:08.379 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:08.637 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:08.637 "name": "pt2", 00:14:08.637 "aliases": [ 00:14:08.637 "00000000-0000-0000-0000-000000000002" 00:14:08.637 ], 00:14:08.637 "product_name": "passthru", 00:14:08.637 "block_size": 512, 00:14:08.637 "num_blocks": 65536, 00:14:08.637 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:08.637 "assigned_rate_limits": { 00:14:08.637 "rw_ios_per_sec": 0, 00:14:08.637 "rw_mbytes_per_sec": 0, 00:14:08.637 "r_mbytes_per_sec": 0, 00:14:08.637 "w_mbytes_per_sec": 0 00:14:08.637 }, 00:14:08.637 "claimed": true, 00:14:08.637 "claim_type": "exclusive_write", 00:14:08.637 "zoned": false, 00:14:08.637 "supported_io_types": { 00:14:08.637 "read": true, 00:14:08.637 "write": true, 00:14:08.637 "unmap": true, 00:14:08.637 "flush": true, 00:14:08.637 "reset": true, 00:14:08.637 "nvme_admin": false, 00:14:08.637 "nvme_io": false, 00:14:08.637 "nvme_io_md": false, 00:14:08.637 "write_zeroes": true, 00:14:08.637 "zcopy": true, 00:14:08.637 "get_zone_info": false, 00:14:08.637 "zone_management": false, 00:14:08.637 "zone_append": false, 00:14:08.637 "compare": false, 00:14:08.637 "compare_and_write": false, 00:14:08.637 "abort": true, 
00:14:08.637 "seek_hole": false, 00:14:08.637 "seek_data": false, 00:14:08.637 "copy": true, 00:14:08.637 "nvme_iov_md": false 00:14:08.637 }, 00:14:08.637 "memory_domains": [ 00:14:08.637 { 00:14:08.637 "dma_device_id": "system", 00:14:08.637 "dma_device_type": 1 00:14:08.637 }, 00:14:08.637 { 00:14:08.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.637 "dma_device_type": 2 00:14:08.637 } 00:14:08.637 ], 00:14:08.637 "driver_specific": { 00:14:08.637 "passthru": { 00:14:08.637 "name": "pt2", 00:14:08.637 "base_bdev_name": "malloc2" 00:14:08.637 } 00:14:08.637 } 00:14:08.637 }' 00:14:08.637 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.637 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:08.895 10:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.895 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.895 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:08.895 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:08.895 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:08.895 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.152 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.152 "name": "pt3", 00:14:09.152 "aliases": [ 00:14:09.152 "00000000-0000-0000-0000-000000000003" 00:14:09.152 ], 00:14:09.152 "product_name": "passthru", 00:14:09.152 "block_size": 512, 00:14:09.152 "num_blocks": 65536, 00:14:09.152 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:09.152 "assigned_rate_limits": { 00:14:09.152 "rw_ios_per_sec": 0, 00:14:09.152 "rw_mbytes_per_sec": 0, 00:14:09.152 "r_mbytes_per_sec": 0, 00:14:09.152 "w_mbytes_per_sec": 0 00:14:09.152 }, 00:14:09.152 "claimed": true, 00:14:09.152 "claim_type": "exclusive_write", 00:14:09.152 "zoned": false, 00:14:09.152 "supported_io_types": { 00:14:09.152 "read": true, 00:14:09.152 "write": true, 00:14:09.152 "unmap": true, 00:14:09.152 "flush": true, 00:14:09.152 "reset": true, 00:14:09.152 "nvme_admin": false, 00:14:09.152 "nvme_io": false, 00:14:09.152 "nvme_io_md": false, 00:14:09.152 "write_zeroes": true, 00:14:09.152 "zcopy": true, 00:14:09.152 "get_zone_info": false, 00:14:09.152 "zone_management": false, 00:14:09.152 "zone_append": false, 00:14:09.152 "compare": false, 00:14:09.152 "compare_and_write": false, 00:14:09.152 "abort": true, 00:14:09.152 "seek_hole": false, 00:14:09.152 "seek_data": false, 00:14:09.152 "copy": true, 00:14:09.152 "nvme_iov_md": false 
00:14:09.152 }, 00:14:09.152 "memory_domains": [ 00:14:09.152 { 00:14:09.152 "dma_device_id": "system", 00:14:09.152 "dma_device_type": 1 00:14:09.152 }, 00:14:09.152 { 00:14:09.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.152 "dma_device_type": 2 00:14:09.152 } 00:14:09.152 ], 00:14:09.152 "driver_specific": { 00:14:09.152 "passthru": { 00:14:09.152 "name": "pt3", 00:14:09.152 "base_bdev_name": "malloc3" 00:14:09.152 } 00:14:09.152 } 00:14:09.152 }' 00:14:09.152 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.409 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.666 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.666 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.666 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:09.666 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:09.923 [2024-07-12 10:40:44.906034] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:09.923 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9cac51bb-4560-4fde-a640-37b78cb0e409 00:14:09.923 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9cac51bb-4560-4fde-a640-37b78cb0e409 ']' 00:14:09.923 10:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:10.181 [2024-07-12 10:40:45.154427] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:10.181 [2024-07-12 10:40:45.154449] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:10.181 [2024-07-12 10:40:45.154504] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:10.181 [2024-07-12 10:40:45.154557] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:10.181 [2024-07-12 10:40:45.154569] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195aea0 name raid_bdev1, state offline 00:14:10.181 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.181 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:10.181 10:40:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:10.181 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:10.181 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:10.181 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:10.438 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:10.438 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:10.695 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:10.695 10:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:10.953 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:10.953 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:11.212 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:14:11.471 [2024-07-12 10:40:46.445780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:11.471 [2024-07-12 10:40:46.447168] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:11.471 [2024-07-12 10:40:46.447211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:11.471 [2024-07-12 10:40:46.447255] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:11.471 [2024-07-12 10:40:46.447295] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:11.471 [2024-07-12 10:40:46.447318] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:11.471 [2024-07-12 10:40:46.447336] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:11.471 [2024-07-12 10:40:46.447346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b05ff0 name raid_bdev1, state configuring 00:14:11.471 request: 00:14:11.471 { 00:14:11.471 "name": "raid_bdev1", 00:14:11.471 "raid_level": "raid0", 00:14:11.471 "base_bdevs": [ 00:14:11.471 "malloc1", 00:14:11.471 "malloc2", 00:14:11.471 "malloc3" 00:14:11.471 ], 00:14:11.471 "strip_size_kb": 64, 00:14:11.471 "superblock": false, 00:14:11.471 "method": "bdev_raid_create", 00:14:11.471 "req_id": 1 00:14:11.471 } 00:14:11.471 Got JSON-RPC error response 00:14:11.471 response: 00:14:11.471 { 00:14:11.471 "code": -17, 00:14:11.471 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:11.471 } 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:11.471 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:11.729 [2024-07-12 10:40:46.850797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:11.729 [2024-07-12 10:40:46.850837] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:11.729 [2024-07-12 10:40:46.850859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19627a0 00:14:11.729 [2024-07-12 10:40:46.850871] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:11.729 [2024-07-12 10:40:46.852499] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:11.729 [2024-07-12 10:40:46.852526] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:11.729 [2024-07-12 10:40:46.852589] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:11.729 [2024-07-12 10:40:46.852614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:11.729 pt1 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.729 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.730 10:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:11.988 10:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.988 "name": "raid_bdev1", 00:14:11.988 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:11.988 "strip_size_kb": 64, 00:14:11.988 "state": "configuring", 00:14:11.988 "raid_level": "raid0", 00:14:11.988 "superblock": true, 00:14:11.988 "num_base_bdevs": 3, 00:14:11.988 "num_base_bdevs_discovered": 1, 00:14:11.988 "num_base_bdevs_operational": 3, 00:14:11.988 "base_bdevs_list": [ 00:14:11.988 { 00:14:11.988 "name": "pt1", 00:14:11.988 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.988 "is_configured": true, 00:14:11.988 "data_offset": 2048, 00:14:11.988 "data_size": 63488 00:14:11.988 }, 00:14:11.988 { 00:14:11.988 "name": null, 00:14:11.988 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.988 "is_configured": false, 00:14:11.988 "data_offset": 2048, 00:14:11.988 "data_size": 63488 00:14:11.988 }, 00:14:11.988 { 00:14:11.988 "name": null, 00:14:11.988 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:11.988 "is_configured": false, 00:14:11.988 "data_offset": 2048, 00:14:11.988 "data_size": 63488 00:14:11.988 } 00:14:11.988 ] 00:14:11.988 }' 00:14:11.988 10:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.988 10:40:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.556 10:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:12.556 10:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:12.816 [2024-07-12 10:40:47.921635] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:12.816 [2024-07-12 10:40:47.921684] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:12.816 [2024-07-12 10:40:47.921703] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1959c70 00:14:12.816 [2024-07-12 10:40:47.921715] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:12.816 [2024-07-12 10:40:47.922045] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:12.816 [2024-07-12 10:40:47.922062] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:12.816 [2024-07-12 10:40:47.922127] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:12.816 [2024-07-12 10:40:47.922145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:12.816 pt2 00:14:12.816 10:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:13.076 [2024-07-12 10:40:48.166296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.076 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:13.335 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.335 "name": "raid_bdev1", 00:14:13.335 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:13.335 "strip_size_kb": 64, 00:14:13.335 "state": "configuring", 00:14:13.335 "raid_level": "raid0", 00:14:13.335 "superblock": true, 00:14:13.335 "num_base_bdevs": 3, 00:14:13.335 "num_base_bdevs_discovered": 1, 00:14:13.335 "num_base_bdevs_operational": 3, 00:14:13.335 "base_bdevs_list": [ 00:14:13.335 { 00:14:13.335 "name": "pt1", 00:14:13.335 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:13.335 "is_configured": true, 00:14:13.335 "data_offset": 2048, 00:14:13.335 "data_size": 63488 00:14:13.335 }, 00:14:13.335 { 00:14:13.335 "name": null, 00:14:13.335 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:13.335 "is_configured": false, 00:14:13.335 
"data_offset": 2048, 00:14:13.335 "data_size": 63488 00:14:13.335 }, 00:14:13.335 { 00:14:13.335 "name": null, 00:14:13.335 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:13.335 "is_configured": false, 00:14:13.335 "data_offset": 2048, 00:14:13.335 "data_size": 63488 00:14:13.335 } 00:14:13.335 ] 00:14:13.335 }' 00:14:13.335 10:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.336 10:40:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.903 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:13.903 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:13.903 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:14.163 [2024-07-12 10:40:49.205040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:14.163 [2024-07-12 10:40:49.205090] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:14.163 [2024-07-12 10:40:49.205113] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1afafa0 00:14:14.163 [2024-07-12 10:40:49.205125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:14.163 [2024-07-12 10:40:49.205454] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:14.163 [2024-07-12 10:40:49.205470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:14.163 [2024-07-12 10:40:49.205536] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:14.163 [2024-07-12 10:40:49.205555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:14.163 pt2 00:14:14.163 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:14.163 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:14.163 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:14.422 [2024-07-12 10:40:49.449685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:14.422 [2024-07-12 10:40:49.449717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:14.422 [2024-07-12 10:40:49.449734] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1afbb30 00:14:14.422 [2024-07-12 10:40:49.449746] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:14.422 [2024-07-12 10:40:49.450035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:14.422 [2024-07-12 10:40:49.450052] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:14.422 [2024-07-12 10:40:49.450104] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:14.422 [2024-07-12 10:40:49.450121] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:14.422 [2024-07-12 10:40:49.450222] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1afcc00 00:14:14.422 [2024-07-12 10:40:49.450232] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:14.422 [2024-07-12 10:40:49.450398] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b059b0 00:14:14.422 [2024-07-12 10:40:49.450529] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1afcc00 00:14:14.422 [2024-07-12 10:40:49.450539] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1afcc00 00:14:14.422 [2024-07-12 10:40:49.450634] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:14.422 pt3 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:14.422 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.681 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.681 "name": "raid_bdev1", 00:14:14.681 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:14.681 "strip_size_kb": 64, 00:14:14.681 "state": "online", 00:14:14.681 "raid_level": "raid0", 00:14:14.681 "superblock": true, 00:14:14.681 "num_base_bdevs": 3, 00:14:14.681 "num_base_bdevs_discovered": 3, 00:14:14.681 "num_base_bdevs_operational": 3, 00:14:14.681 "base_bdevs_list": [ 00:14:14.681 { 00:14:14.681 "name": "pt1", 00:14:14.681 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:14.681 "is_configured": true, 00:14:14.681 "data_offset": 2048, 00:14:14.681 "data_size": 63488 00:14:14.681 }, 00:14:14.681 { 00:14:14.681 "name": "pt2", 00:14:14.681 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:14.681 "is_configured": true, 00:14:14.681 "data_offset": 2048, 00:14:14.681 "data_size": 63488 00:14:14.681 }, 00:14:14.681 { 00:14:14.681 "name": "pt3", 00:14:14.682 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:14.682 "is_configured": true, 00:14:14.682 "data_offset": 2048, 00:14:14.682 "data_size": 63488 00:14:14.682 } 00:14:14.682 ] 00:14:14.682 }' 00:14:14.682 10:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.682 10:40:49 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:15.249 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:15.508 [2024-07-12 10:40:50.532837] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:15.508 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:15.508 "name": "raid_bdev1", 00:14:15.508 "aliases": [ 00:14:15.508 "9cac51bb-4560-4fde-a640-37b78cb0e409" 00:14:15.508 ], 00:14:15.508 "product_name": "Raid Volume", 00:14:15.508 "block_size": 512, 00:14:15.508 "num_blocks": 190464, 00:14:15.508 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:15.508 "assigned_rate_limits": { 00:14:15.508 "rw_ios_per_sec": 0, 00:14:15.508 "rw_mbytes_per_sec": 0, 00:14:15.508 "r_mbytes_per_sec": 0, 00:14:15.508 "w_mbytes_per_sec": 0 00:14:15.508 }, 00:14:15.508 "claimed": false, 00:14:15.508 "zoned": false, 00:14:15.508 "supported_io_types": { 00:14:15.508 "read": true, 00:14:15.508 "write": true, 00:14:15.508 "unmap": true, 00:14:15.508 "flush": true, 00:14:15.508 "reset": true, 00:14:15.508 "nvme_admin": false, 00:14:15.508 "nvme_io": false, 00:14:15.508 "nvme_io_md": false, 00:14:15.508 "write_zeroes": true, 00:14:15.508 "zcopy": false, 00:14:15.508 "get_zone_info": false, 00:14:15.508 "zone_management": false, 00:14:15.508 "zone_append": false, 00:14:15.508 "compare": false, 00:14:15.508 "compare_and_write": false, 00:14:15.508 "abort": false, 00:14:15.508 "seek_hole": false, 00:14:15.508 "seek_data": false, 00:14:15.508 "copy": false, 00:14:15.508 "nvme_iov_md": false 00:14:15.508 }, 00:14:15.508 "memory_domains": [ 00:14:15.508 { 00:14:15.508 "dma_device_id": "system", 00:14:15.508 "dma_device_type": 1 00:14:15.508 }, 00:14:15.508 { 00:14:15.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.508 "dma_device_type": 2 00:14:15.508 }, 00:14:15.508 { 00:14:15.508 "dma_device_id": "system", 00:14:15.508 "dma_device_type": 1 00:14:15.508 }, 00:14:15.508 { 00:14:15.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.508 "dma_device_type": 2 00:14:15.508 }, 00:14:15.508 { 00:14:15.508 "dma_device_id": "system", 00:14:15.508 "dma_device_type": 1 00:14:15.508 }, 00:14:15.508 { 00:14:15.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.508 "dma_device_type": 2 00:14:15.508 } 00:14:15.508 ], 00:14:15.508 "driver_specific": { 00:14:15.508 "raid": { 00:14:15.508 "uuid": "9cac51bb-4560-4fde-a640-37b78cb0e409", 00:14:15.508 "strip_size_kb": 64, 00:14:15.508 "state": "online", 00:14:15.508 "raid_level": "raid0", 00:14:15.508 "superblock": true, 00:14:15.508 "num_base_bdevs": 3, 00:14:15.508 "num_base_bdevs_discovered": 3, 
00:14:15.508 "num_base_bdevs_operational": 3, 00:14:15.508 "base_bdevs_list": [ 00:14:15.508 { 00:14:15.508 "name": "pt1", 00:14:15.508 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:15.508 "is_configured": true, 00:14:15.508 "data_offset": 2048, 00:14:15.508 "data_size": 63488 00:14:15.508 }, 00:14:15.508 { 00:14:15.508 "name": "pt2", 00:14:15.508 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:15.508 "is_configured": true, 00:14:15.508 "data_offset": 2048, 00:14:15.508 "data_size": 63488 00:14:15.508 }, 00:14:15.508 { 00:14:15.508 "name": "pt3", 00:14:15.508 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:15.508 "is_configured": true, 00:14:15.508 "data_offset": 2048, 00:14:15.508 "data_size": 63488 00:14:15.508 } 00:14:15.508 ] 00:14:15.508 } 00:14:15.508 } 00:14:15.508 }' 00:14:15.508 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:15.508 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:15.508 pt2 00:14:15.508 pt3' 00:14:15.508 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:15.508 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:15.508 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.767 "name": "pt1", 00:14:15.767 "aliases": [ 00:14:15.767 "00000000-0000-0000-0000-000000000001" 00:14:15.767 ], 00:14:15.767 "product_name": "passthru", 00:14:15.767 "block_size": 512, 00:14:15.767 "num_blocks": 65536, 00:14:15.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:15.767 "assigned_rate_limits": { 00:14:15.767 "rw_ios_per_sec": 0, 00:14:15.767 "rw_mbytes_per_sec": 0, 00:14:15.767 "r_mbytes_per_sec": 0, 00:14:15.767 "w_mbytes_per_sec": 0 00:14:15.767 }, 00:14:15.767 "claimed": true, 00:14:15.767 "claim_type": "exclusive_write", 00:14:15.767 "zoned": false, 00:14:15.767 "supported_io_types": { 00:14:15.767 "read": true, 00:14:15.767 "write": true, 00:14:15.767 "unmap": true, 00:14:15.767 "flush": true, 00:14:15.767 "reset": true, 00:14:15.767 "nvme_admin": false, 00:14:15.767 "nvme_io": false, 00:14:15.767 "nvme_io_md": false, 00:14:15.767 "write_zeroes": true, 00:14:15.767 "zcopy": true, 00:14:15.767 "get_zone_info": false, 00:14:15.767 "zone_management": false, 00:14:15.767 "zone_append": false, 00:14:15.767 "compare": false, 00:14:15.767 "compare_and_write": false, 00:14:15.767 "abort": true, 00:14:15.767 "seek_hole": false, 00:14:15.767 "seek_data": false, 00:14:15.767 "copy": true, 00:14:15.767 "nvme_iov_md": false 00:14:15.767 }, 00:14:15.767 "memory_domains": [ 00:14:15.767 { 00:14:15.767 "dma_device_id": "system", 00:14:15.767 "dma_device_type": 1 00:14:15.767 }, 00:14:15.767 { 00:14:15.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.767 "dma_device_type": 2 00:14:15.767 } 00:14:15.767 ], 00:14:15.767 "driver_specific": { 00:14:15.767 "passthru": { 00:14:15.767 "name": "pt1", 00:14:15.767 "base_bdev_name": "malloc1" 00:14:15.767 } 00:14:15.767 } 00:14:15.767 }' 00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.767 10:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:16.026 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:16.285 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:16.285 "name": "pt2", 00:14:16.285 "aliases": [ 00:14:16.285 "00000000-0000-0000-0000-000000000002" 00:14:16.285 ], 00:14:16.285 "product_name": "passthru", 00:14:16.285 "block_size": 512, 00:14:16.285 "num_blocks": 65536, 00:14:16.285 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.285 "assigned_rate_limits": { 00:14:16.285 "rw_ios_per_sec": 0, 00:14:16.285 "rw_mbytes_per_sec": 0, 00:14:16.285 "r_mbytes_per_sec": 0, 00:14:16.285 "w_mbytes_per_sec": 0 00:14:16.285 }, 00:14:16.285 "claimed": true, 00:14:16.285 "claim_type": "exclusive_write", 00:14:16.285 "zoned": false, 00:14:16.285 "supported_io_types": { 00:14:16.285 "read": true, 00:14:16.285 "write": true, 00:14:16.285 "unmap": true, 00:14:16.285 "flush": true, 00:14:16.285 "reset": true, 00:14:16.285 "nvme_admin": false, 00:14:16.285 "nvme_io": false, 00:14:16.285 "nvme_io_md": false, 00:14:16.285 "write_zeroes": true, 00:14:16.285 "zcopy": true, 00:14:16.285 "get_zone_info": false, 00:14:16.285 "zone_management": false, 00:14:16.285 "zone_append": false, 00:14:16.285 "compare": false, 00:14:16.285 "compare_and_write": false, 00:14:16.285 "abort": true, 00:14:16.285 "seek_hole": false, 00:14:16.285 "seek_data": false, 00:14:16.285 "copy": true, 00:14:16.285 "nvme_iov_md": false 00:14:16.285 }, 00:14:16.285 "memory_domains": [ 00:14:16.285 { 00:14:16.285 "dma_device_id": "system", 00:14:16.285 "dma_device_type": 1 00:14:16.285 }, 00:14:16.285 { 00:14:16.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.285 "dma_device_type": 2 00:14:16.285 } 00:14:16.285 ], 00:14:16.285 "driver_specific": { 00:14:16.285 "passthru": { 00:14:16.285 "name": "pt2", 00:14:16.285 "base_bdev_name": "malloc2" 00:14:16.285 } 00:14:16.285 } 00:14:16.285 }' 00:14:16.285 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.285 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.285 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:16.285 10:40:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:16.543 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:16.802 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:16.802 "name": "pt3", 00:14:16.802 "aliases": [ 00:14:16.802 "00000000-0000-0000-0000-000000000003" 00:14:16.802 ], 00:14:16.802 "product_name": "passthru", 00:14:16.802 "block_size": 512, 00:14:16.802 "num_blocks": 65536, 00:14:16.802 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:16.802 "assigned_rate_limits": { 00:14:16.802 "rw_ios_per_sec": 0, 00:14:16.802 "rw_mbytes_per_sec": 0, 00:14:16.802 "r_mbytes_per_sec": 0, 00:14:16.802 "w_mbytes_per_sec": 0 00:14:16.802 }, 00:14:16.802 "claimed": true, 00:14:16.802 "claim_type": "exclusive_write", 00:14:16.802 "zoned": false, 00:14:16.802 "supported_io_types": { 00:14:16.802 "read": true, 00:14:16.802 "write": true, 00:14:16.802 "unmap": true, 00:14:16.802 "flush": true, 00:14:16.802 "reset": true, 00:14:16.802 "nvme_admin": false, 00:14:16.802 "nvme_io": false, 00:14:16.802 "nvme_io_md": false, 00:14:16.802 "write_zeroes": true, 00:14:16.802 "zcopy": true, 00:14:16.802 "get_zone_info": false, 00:14:16.802 "zone_management": false, 00:14:16.802 "zone_append": false, 00:14:16.802 "compare": false, 00:14:16.802 "compare_and_write": false, 00:14:16.802 "abort": true, 00:14:16.802 "seek_hole": false, 00:14:16.802 "seek_data": false, 00:14:16.802 "copy": true, 00:14:16.802 "nvme_iov_md": false 00:14:16.802 }, 00:14:16.802 "memory_domains": [ 00:14:16.802 { 00:14:16.802 "dma_device_id": "system", 00:14:16.802 "dma_device_type": 1 00:14:16.802 }, 00:14:16.802 { 00:14:16.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.802 "dma_device_type": 2 00:14:16.802 } 00:14:16.802 ], 00:14:16.802 "driver_specific": { 00:14:16.802 "passthru": { 00:14:16.802 "name": "pt3", 00:14:16.802 "base_bdev_name": "malloc3" 00:14:16.802 } 00:14:16.802 } 00:14:16.802 }' 00:14:16.802 10:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.061 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.320 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.320 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.320 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:17.320 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:17.579 [2024-07-12 10:40:52.550184] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9cac51bb-4560-4fde-a640-37b78cb0e409 '!=' 9cac51bb-4560-4fde-a640-37b78cb0e409 ']' 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2044369 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2044369 ']' 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2044369 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2044369 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2044369' 00:14:17.579 killing process with pid 2044369 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2044369 00:14:17.579 [2024-07-12 10:40:52.619709] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:17.579 [2024-07-12 10:40:52.619763] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:17.579 [2024-07-12 10:40:52.619813] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:17.579 [2024-07-12 10:40:52.619824] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afcc00 name raid_bdev1, state offline 00:14:17.579 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2044369 00:14:17.579 [2024-07-12 10:40:52.648176] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:17.839 10:40:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:17.839 00:14:17.839 real 0m13.774s 00:14:17.839 user 0m24.742s 00:14:17.839 sys 0m2.505s 00:14:17.839 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:17.839 10:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.839 ************************************ 00:14:17.839 END TEST raid_superblock_test 00:14:17.839 ************************************ 00:14:17.839 10:40:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:17.839 10:40:52 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:17.839 10:40:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:17.839 10:40:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:17.839 10:40:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:17.839 ************************************ 00:14:17.839 START TEST raid_read_error_test 00:14:17.839 ************************************ 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:17.839 10:40:52 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.OJjtTGnoNi 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2046432 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2046432 /var/tmp/spdk-raid.sock 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2046432 ']' 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:17.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:17.839 10:40:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.097 [2024-07-12 10:40:53.036351] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
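For reference, the bdevperf invocation recorded just above drives all I/O for raid_read_error_test. The following is a rough standalone sketch only, assuming the same workspace path, RPC socket and options captured verbatim in the trace (the harness also captures bdevperf output into the mktemp'd log under /raidtest, which is grepped at the end of the test):

    # 60 s of 50/50 random read/write at 128 KiB, queue depth 1, restricted to raid_bdev1,
    # with bdev_raid debug logging, over the test's shared RPC socket
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid

The -z option keeps bdevperf idle until the perform_tests RPC is issued later in the trace, which is why the raid bdev can still be assembled after the process is already listening.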
00:14:18.097 [2024-07-12 10:40:53.036426] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046432 ] 00:14:18.097 [2024-07-12 10:40:53.166883] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:18.097 [2024-07-12 10:40:53.265677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:18.356 [2024-07-12 10:40:53.329972] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:18.356 [2024-07-12 10:40:53.330010] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:18.922 10:40:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:18.922 10:40:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:18.922 10:40:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:18.922 10:40:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:18.922 BaseBdev1_malloc 00:14:18.922 10:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:19.180 true 00:14:19.180 10:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:19.438 [2024-07-12 10:40:54.399459] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:19.438 [2024-07-12 10:40:54.399509] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.438 [2024-07-12 10:40:54.399529] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14f40d0 00:14:19.438 [2024-07-12 10:40:54.399541] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.438 [2024-07-12 10:40:54.401240] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.438 [2024-07-12 10:40:54.401267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:19.438 BaseBdev1 00:14:19.438 10:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:19.438 10:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:19.697 BaseBdev2_malloc 00:14:19.697 10:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:19.697 true 00:14:19.697 10:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:19.955 [2024-07-12 10:40:55.009802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:19.955 [2024-07-12 10:40:55.009845] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.955 [2024-07-12 10:40:55.009864] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14f8910 00:14:19.955 [2024-07-12 10:40:55.009877] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.955 [2024-07-12 10:40:55.011254] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.955 [2024-07-12 10:40:55.011280] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:19.955 BaseBdev2 00:14:19.955 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:19.955 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:20.215 BaseBdev3_malloc 00:14:20.215 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:20.215 true 00:14:20.215 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:20.488 [2024-07-12 10:40:55.595855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:20.488 [2024-07-12 10:40:55.595899] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:20.488 [2024-07-12 10:40:55.595918] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14fabd0 00:14:20.488 [2024-07-12 10:40:55.595930] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:20.488 [2024-07-12 10:40:55.597294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:20.488 [2024-07-12 10:40:55.597319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:20.488 BaseBdev3 00:14:20.488 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:20.776 [2024-07-12 10:40:55.840573] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:20.776 [2024-07-12 10:40:55.841975] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:20.776 [2024-07-12 10:40:55.842048] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:20.776 [2024-07-12 10:40:55.842263] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14fc280 00:14:20.776 [2024-07-12 10:40:55.842275] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:20.776 [2024-07-12 10:40:55.842501] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14fbe20 00:14:20.776 [2024-07-12 10:40:55.842656] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14fc280 00:14:20.776 [2024-07-12 10:40:55.842666] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14fc280 00:14:20.776 [2024-07-12 10:40:55.842775] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:20.776 
10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.776 10:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:21.035 10:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.035 "name": "raid_bdev1", 00:14:21.035 "uuid": "efb2850a-fdef-46df-8231-244945700540", 00:14:21.035 "strip_size_kb": 64, 00:14:21.035 "state": "online", 00:14:21.035 "raid_level": "raid0", 00:14:21.035 "superblock": true, 00:14:21.035 "num_base_bdevs": 3, 00:14:21.035 "num_base_bdevs_discovered": 3, 00:14:21.035 "num_base_bdevs_operational": 3, 00:14:21.035 "base_bdevs_list": [ 00:14:21.035 { 00:14:21.035 "name": "BaseBdev1", 00:14:21.035 "uuid": "96d4b352-d734-5c5e-a4e9-cf8123a518d8", 00:14:21.035 "is_configured": true, 00:14:21.035 "data_offset": 2048, 00:14:21.035 "data_size": 63488 00:14:21.035 }, 00:14:21.035 { 00:14:21.035 "name": "BaseBdev2", 00:14:21.035 "uuid": "87e952a5-643d-5579-9c77-8dc12a8a482a", 00:14:21.035 "is_configured": true, 00:14:21.035 "data_offset": 2048, 00:14:21.035 "data_size": 63488 00:14:21.035 }, 00:14:21.035 { 00:14:21.035 "name": "BaseBdev3", 00:14:21.035 "uuid": "dea7a780-b91b-5046-904a-908763b5ad9e", 00:14:21.035 "is_configured": true, 00:14:21.035 "data_offset": 2048, 00:14:21.035 "data_size": 63488 00:14:21.035 } 00:14:21.035 ] 00:14:21.035 }' 00:14:21.035 10:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.035 10:40:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.604 10:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:21.604 10:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:21.863 [2024-07-12 10:40:56.799367] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x134a5b0 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.801 10:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:23.061 10:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.061 "name": "raid_bdev1", 00:14:23.061 "uuid": "efb2850a-fdef-46df-8231-244945700540", 00:14:23.061 "strip_size_kb": 64, 00:14:23.061 "state": "online", 00:14:23.061 "raid_level": "raid0", 00:14:23.061 "superblock": true, 00:14:23.061 "num_base_bdevs": 3, 00:14:23.061 "num_base_bdevs_discovered": 3, 00:14:23.061 "num_base_bdevs_operational": 3, 00:14:23.061 "base_bdevs_list": [ 00:14:23.061 { 00:14:23.061 "name": "BaseBdev1", 00:14:23.061 "uuid": "96d4b352-d734-5c5e-a4e9-cf8123a518d8", 00:14:23.061 "is_configured": true, 00:14:23.061 "data_offset": 2048, 00:14:23.061 "data_size": 63488 00:14:23.061 }, 00:14:23.061 { 00:14:23.061 "name": "BaseBdev2", 00:14:23.061 "uuid": "87e952a5-643d-5579-9c77-8dc12a8a482a", 00:14:23.061 "is_configured": true, 00:14:23.061 "data_offset": 2048, 00:14:23.061 "data_size": 63488 00:14:23.061 }, 00:14:23.061 { 00:14:23.061 "name": "BaseBdev3", 00:14:23.061 "uuid": "dea7a780-b91b-5046-904a-908763b5ad9e", 00:14:23.061 "is_configured": true, 00:14:23.061 "data_offset": 2048, 00:14:23.061 "data_size": 63488 00:14:23.061 } 00:14:23.061 ] 00:14:23.061 }' 00:14:23.061 10:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.061 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.629 10:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:23.889 [2024-07-12 10:40:58.931914] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:23.889 [2024-07-12 10:40:58.931958] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:23.889 [2024-07-12 
10:40:58.935126] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:23.889 [2024-07-12 10:40:58.935164] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:23.889 [2024-07-12 10:40:58.935199] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:23.889 [2024-07-12 10:40:58.935217] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14fc280 name raid_bdev1, state offline 00:14:23.889 0 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2046432 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2046432 ']' 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2046432 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2046432 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2046432' 00:14:23.889 killing process with pid 2046432 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2046432 00:14:23.889 [2024-07-12 10:40:58.999237] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:23.889 10:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2046432 00:14:23.889 [2024-07-12 10:40:59.020583] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.OJjtTGnoNi 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:14:24.148 00:14:24.148 real 0m6.304s 00:14:24.148 user 0m9.813s 00:14:24.148 sys 0m1.135s 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:24.148 10:40:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:24.148 ************************************ 00:14:24.148 END TEST raid_read_error_test 00:14:24.148 ************************************ 00:14:24.148 10:40:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:24.148 10:40:59 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:14:24.148 10:40:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:24.148 
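The read test that finishes above and the write test that starts below share the raid_io_error_test flow. A rough recap of its error-injection and failure-rate check, built only from commands that appear verbatim in the trace (the exact ordering and backgrounding inside bdev_raid.sh are not reproduced here):

    # make the error bdev wrapped under BaseBdev1 fail reads (the write variant injects 'write failure')
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_error_inject_error EE_BaseBdev1_malloc read failure

    # kick off the queued bdevperf job over the same RPC socket
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests

    # pull the failures-per-second column for raid_bdev1 out of the captured bdevperf log
    grep -v Job /raidtest/tmp.OJjtTGnoNi | grep raid_bdev1 | awk '{print $6}'

Since raid0 carries no redundancy (has_redundancy returns 1), the test only asserts that the observed failure rate differs from 0.00, so the fail_per_s of 0.47 above counts as a pass.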
10:40:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:24.148 10:40:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:24.408 ************************************ 00:14:24.408 START TEST raid_write_error_test 00:14:24.408 ************************************ 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6ELrwYoJIT 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2047397 00:14:24.408 10:40:59 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2047397 /var/tmp/spdk-raid.sock 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2047397 ']' 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:24.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:24.408 10:40:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:24.408 [2024-07-12 10:40:59.395286] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:14:24.408 [2024-07-12 10:40:59.395330] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047397 ] 00:14:24.408 [2024-07-12 10:40:59.508850] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:24.667 [2024-07-12 10:40:59.618021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:24.667 [2024-07-12 10:40:59.685591] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:24.667 [2024-07-12 10:40:59.685624] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:25.234 10:41:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:25.234 10:41:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:25.235 10:41:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:25.235 10:41:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:25.494 BaseBdev1_malloc 00:14:25.494 10:41:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:25.753 true 00:14:25.753 10:41:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:26.012 [2024-07-12 10:41:01.073031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:26.012 [2024-07-12 10:41:01.073079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:26.012 [2024-07-12 10:41:01.073103] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aaf0d0 00:14:26.012 [2024-07-12 10:41:01.073116] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:26.012 [2024-07-12 10:41:01.075024] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:26.012 [2024-07-12 10:41:01.075056] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:14:26.012 BaseBdev1 00:14:26.012 10:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:26.012 10:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:26.271 BaseBdev2_malloc 00:14:26.271 10:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:26.531 true 00:14:26.531 10:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:26.799 [2024-07-12 10:41:01.799581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:26.799 [2024-07-12 10:41:01.799623] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:26.799 [2024-07-12 10:41:01.799645] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab3910 00:14:26.799 [2024-07-12 10:41:01.799657] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:26.799 [2024-07-12 10:41:01.801231] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:26.799 [2024-07-12 10:41:01.801258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:26.799 BaseBdev2 00:14:26.799 10:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:26.799 10:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:27.056 BaseBdev3_malloc 00:14:27.056 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:27.314 true 00:14:27.314 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:27.573 [2024-07-12 10:41:02.535308] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:27.573 [2024-07-12 10:41:02.535353] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:27.573 [2024-07-12 10:41:02.535375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab5bd0 00:14:27.574 [2024-07-12 10:41:02.535390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:27.574 [2024-07-12 10:41:02.536988] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:27.574 [2024-07-12 10:41:02.537018] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:27.574 BaseBdev3 00:14:27.574 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:27.574 [2024-07-12 10:41:02.767955] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:27.832 [2024-07-12 10:41:02.769346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:27.832 [2024-07-12 10:41:02.769419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:27.832 [2024-07-12 10:41:02.769657] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ab7280 00:14:27.832 [2024-07-12 10:41:02.769671] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:27.832 [2024-07-12 10:41:02.769871] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ab6e20 00:14:27.832 [2024-07-12 10:41:02.770021] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ab7280 00:14:27.832 [2024-07-12 10:41:02.770032] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ab7280 00:14:27.832 [2024-07-12 10:41:02.770151] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.832 10:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:28.091 10:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.091 "name": "raid_bdev1", 00:14:28.091 "uuid": "fa800a90-bdf5-4600-b432-e699602ec95f", 00:14:28.091 "strip_size_kb": 64, 00:14:28.091 "state": "online", 00:14:28.091 "raid_level": "raid0", 00:14:28.091 "superblock": true, 00:14:28.091 "num_base_bdevs": 3, 00:14:28.091 "num_base_bdevs_discovered": 3, 00:14:28.091 "num_base_bdevs_operational": 3, 00:14:28.091 "base_bdevs_list": [ 00:14:28.091 { 00:14:28.091 "name": "BaseBdev1", 00:14:28.091 "uuid": "3f584521-6891-5055-bbbe-4eafa77e149c", 00:14:28.091 "is_configured": true, 00:14:28.091 "data_offset": 2048, 00:14:28.091 "data_size": 63488 00:14:28.091 }, 00:14:28.091 { 00:14:28.091 "name": "BaseBdev2", 00:14:28.091 "uuid": "506062f5-5662-56e4-ba39-ae2c91ab3655", 00:14:28.091 "is_configured": true, 00:14:28.091 "data_offset": 2048, 00:14:28.091 "data_size": 63488 00:14:28.091 }, 00:14:28.091 { 00:14:28.091 "name": "BaseBdev3", 00:14:28.091 "uuid": 
"6013c30d-aaf8-5814-ae92-80faf1dd2b11", 00:14:28.091 "is_configured": true, 00:14:28.091 "data_offset": 2048, 00:14:28.091 "data_size": 63488 00:14:28.091 } 00:14:28.091 ] 00:14:28.091 }' 00:14:28.091 10:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.091 10:41:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.658 10:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:28.658 10:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:28.658 [2024-07-12 10:41:03.742821] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19055b0 00:14:29.595 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.854 10:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.112 10:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.112 "name": "raid_bdev1", 00:14:30.112 "uuid": "fa800a90-bdf5-4600-b432-e699602ec95f", 00:14:30.112 "strip_size_kb": 64, 00:14:30.112 "state": "online", 00:14:30.112 "raid_level": "raid0", 00:14:30.112 "superblock": true, 00:14:30.112 "num_base_bdevs": 3, 00:14:30.112 "num_base_bdevs_discovered": 3, 00:14:30.112 "num_base_bdevs_operational": 3, 00:14:30.112 "base_bdevs_list": [ 00:14:30.112 { 00:14:30.112 "name": "BaseBdev1", 00:14:30.112 "uuid": "3f584521-6891-5055-bbbe-4eafa77e149c", 00:14:30.112 "is_configured": true, 00:14:30.112 "data_offset": 2048, 00:14:30.112 "data_size": 63488 00:14:30.112 }, 00:14:30.112 { 
00:14:30.112 "name": "BaseBdev2", 00:14:30.112 "uuid": "506062f5-5662-56e4-ba39-ae2c91ab3655", 00:14:30.112 "is_configured": true, 00:14:30.112 "data_offset": 2048, 00:14:30.112 "data_size": 63488 00:14:30.112 }, 00:14:30.112 { 00:14:30.112 "name": "BaseBdev3", 00:14:30.112 "uuid": "6013c30d-aaf8-5814-ae92-80faf1dd2b11", 00:14:30.112 "is_configured": true, 00:14:30.112 "data_offset": 2048, 00:14:30.112 "data_size": 63488 00:14:30.112 } 00:14:30.112 ] 00:14:30.112 }' 00:14:30.112 10:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.112 10:41:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.677 10:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:30.935 [2024-07-12 10:41:05.943681] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:30.935 [2024-07-12 10:41:05.943717] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:30.935 [2024-07-12 10:41:05.946886] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:30.935 [2024-07-12 10:41:05.946923] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:30.935 [2024-07-12 10:41:05.946958] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:30.935 [2024-07-12 10:41:05.946969] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ab7280 name raid_bdev1, state offline 00:14:30.935 0 00:14:30.935 10:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2047397 00:14:30.935 10:41:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2047397 ']' 00:14:30.935 10:41:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2047397 00:14:30.935 10:41:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:30.935 10:41:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:30.935 10:41:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2047397 00:14:30.935 10:41:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:30.935 10:41:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:30.935 10:41:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2047397' 00:14:30.935 killing process with pid 2047397 00:14:30.935 10:41:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2047397 00:14:30.935 [2024-07-12 10:41:06.011132] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:30.935 10:41:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2047397 00:14:30.936 [2024-07-12 10:41:06.031870] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6ELrwYoJIT 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:14:31.194 00:14:31.194 real 0m6.920s 00:14:31.194 user 0m10.954s 00:14:31.194 sys 0m1.232s 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:31.194 10:41:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.194 ************************************ 00:14:31.194 END TEST raid_write_error_test 00:14:31.194 ************************************ 00:14:31.194 10:41:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:31.194 10:41:06 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:31.194 10:41:06 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:31.194 10:41:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:31.194 10:41:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:31.194 10:41:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:31.194 ************************************ 00:14:31.194 START TEST raid_state_function_test 00:14:31.194 ************************************ 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 
00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2048381 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2048381' 00:14:31.194 Process raid pid: 2048381 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2048381 /var/tmp/spdk-raid.sock 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2048381 ']' 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:31.194 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:31.194 10:41:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.453 [2024-07-12 10:41:06.398948] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
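Unlike the bdevperf-driven tests above, raid_state_function_test needs only the bare bdev_svc application and drives every state transition over RPC. A minimal sketch using the paths and options shown in the trace; the trailing '| .state' jq filter is added here for illustration, while the script itself selects the whole Existed_Raid object and checks its fields:

    # start the bare bdev service that the state-function test talks to
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &

    # declaring a concat array before its base bdevs exist leaves it "configuring"
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

In the trace below the array is still "configuring" even after BaseBdev1 has been created, with num_base_bdevs_discovered at 1 of the 3 operational base bdevs.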
00:14:31.453 [2024-07-12 10:41:06.399012] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:31.453 [2024-07-12 10:41:06.529921] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:31.453 [2024-07-12 10:41:06.637998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:31.711 [2024-07-12 10:41:06.711612] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:31.711 [2024-07-12 10:41:06.711647] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:32.276 10:41:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:32.276 10:41:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:32.276 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:32.534 [2024-07-12 10:41:07.723388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:32.534 [2024-07-12 10:41:07.723432] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:32.534 [2024-07-12 10:41:07.723443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:32.534 [2024-07-12 10:41:07.723454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:32.534 [2024-07-12 10:41:07.723463] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:32.534 [2024-07-12 10:41:07.723474] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:32.793 10:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.051 10:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:14:33.051 "name": "Existed_Raid", 00:14:33.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.051 "strip_size_kb": 64, 00:14:33.051 "state": "configuring", 00:14:33.051 "raid_level": "concat", 00:14:33.051 "superblock": false, 00:14:33.051 "num_base_bdevs": 3, 00:14:33.051 "num_base_bdevs_discovered": 0, 00:14:33.051 "num_base_bdevs_operational": 3, 00:14:33.051 "base_bdevs_list": [ 00:14:33.051 { 00:14:33.051 "name": "BaseBdev1", 00:14:33.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.051 "is_configured": false, 00:14:33.051 "data_offset": 0, 00:14:33.051 "data_size": 0 00:14:33.051 }, 00:14:33.051 { 00:14:33.051 "name": "BaseBdev2", 00:14:33.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.051 "is_configured": false, 00:14:33.051 "data_offset": 0, 00:14:33.052 "data_size": 0 00:14:33.052 }, 00:14:33.052 { 00:14:33.052 "name": "BaseBdev3", 00:14:33.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.052 "is_configured": false, 00:14:33.052 "data_offset": 0, 00:14:33.052 "data_size": 0 00:14:33.052 } 00:14:33.052 ] 00:14:33.052 }' 00:14:33.052 10:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.052 10:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.986 10:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:33.986 [2024-07-12 10:41:09.090969] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:33.986 [2024-07-12 10:41:09.091001] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f73a80 name Existed_Raid, state configuring 00:14:33.986 10:41:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:34.245 [2024-07-12 10:41:09.331626] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:34.245 [2024-07-12 10:41:09.331656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:34.245 [2024-07-12 10:41:09.331665] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:34.245 [2024-07-12 10:41:09.331676] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:34.245 [2024-07-12 10:41:09.331685] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:34.245 [2024-07-12 10:41:09.331695] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:34.245 10:41:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:34.504 [2024-07-12 10:41:09.590109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:34.504 BaseBdev1 00:14:34.504 10:41:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:34.504 10:41:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:34.504 10:41:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:14:34.504 10:41:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:34.504 10:41:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:34.504 10:41:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:34.504 10:41:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:34.762 10:41:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:35.060 [ 00:14:35.060 { 00:14:35.060 "name": "BaseBdev1", 00:14:35.060 "aliases": [ 00:14:35.060 "a844f751-0126-4158-a1d9-22ac7c8b71e2" 00:14:35.060 ], 00:14:35.060 "product_name": "Malloc disk", 00:14:35.060 "block_size": 512, 00:14:35.060 "num_blocks": 65536, 00:14:35.060 "uuid": "a844f751-0126-4158-a1d9-22ac7c8b71e2", 00:14:35.060 "assigned_rate_limits": { 00:14:35.060 "rw_ios_per_sec": 0, 00:14:35.060 "rw_mbytes_per_sec": 0, 00:14:35.060 "r_mbytes_per_sec": 0, 00:14:35.060 "w_mbytes_per_sec": 0 00:14:35.060 }, 00:14:35.060 "claimed": true, 00:14:35.060 "claim_type": "exclusive_write", 00:14:35.060 "zoned": false, 00:14:35.060 "supported_io_types": { 00:14:35.060 "read": true, 00:14:35.060 "write": true, 00:14:35.060 "unmap": true, 00:14:35.060 "flush": true, 00:14:35.060 "reset": true, 00:14:35.060 "nvme_admin": false, 00:14:35.060 "nvme_io": false, 00:14:35.060 "nvme_io_md": false, 00:14:35.060 "write_zeroes": true, 00:14:35.060 "zcopy": true, 00:14:35.060 "get_zone_info": false, 00:14:35.060 "zone_management": false, 00:14:35.060 "zone_append": false, 00:14:35.060 "compare": false, 00:14:35.060 "compare_and_write": false, 00:14:35.060 "abort": true, 00:14:35.060 "seek_hole": false, 00:14:35.060 "seek_data": false, 00:14:35.060 "copy": true, 00:14:35.060 "nvme_iov_md": false 00:14:35.060 }, 00:14:35.060 "memory_domains": [ 00:14:35.060 { 00:14:35.060 "dma_device_id": "system", 00:14:35.060 "dma_device_type": 1 00:14:35.060 }, 00:14:35.060 { 00:14:35.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.060 "dma_device_type": 2 00:14:35.060 } 00:14:35.060 ], 00:14:35.060 "driver_specific": {} 00:14:35.060 } 00:14:35.060 ] 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.060 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.319 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.319 "name": "Existed_Raid", 00:14:35.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.319 "strip_size_kb": 64, 00:14:35.319 "state": "configuring", 00:14:35.319 "raid_level": "concat", 00:14:35.319 "superblock": false, 00:14:35.319 "num_base_bdevs": 3, 00:14:35.319 "num_base_bdevs_discovered": 1, 00:14:35.319 "num_base_bdevs_operational": 3, 00:14:35.319 "base_bdevs_list": [ 00:14:35.319 { 00:14:35.319 "name": "BaseBdev1", 00:14:35.319 "uuid": "a844f751-0126-4158-a1d9-22ac7c8b71e2", 00:14:35.319 "is_configured": true, 00:14:35.319 "data_offset": 0, 00:14:35.319 "data_size": 65536 00:14:35.319 }, 00:14:35.319 { 00:14:35.319 "name": "BaseBdev2", 00:14:35.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.319 "is_configured": false, 00:14:35.319 "data_offset": 0, 00:14:35.319 "data_size": 0 00:14:35.319 }, 00:14:35.319 { 00:14:35.319 "name": "BaseBdev3", 00:14:35.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.319 "is_configured": false, 00:14:35.319 "data_offset": 0, 00:14:35.319 "data_size": 0 00:14:35.319 } 00:14:35.319 ] 00:14:35.319 }' 00:14:35.319 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.320 10:41:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.886 10:41:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:36.144 [2024-07-12 10:41:11.154266] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:36.144 [2024-07-12 10:41:11.154307] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f73310 name Existed_Raid, state configuring 00:14:36.144 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:36.402 [2024-07-12 10:41:11.398951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:36.402 [2024-07-12 10:41:11.400385] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:36.402 [2024-07-12 10:41:11.400418] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:36.402 [2024-07-12 10:41:11.400428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:36.402 [2024-07-12 10:41:11.400439] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.402 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.660 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.660 "name": "Existed_Raid", 00:14:36.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.660 "strip_size_kb": 64, 00:14:36.660 "state": "configuring", 00:14:36.660 "raid_level": "concat", 00:14:36.660 "superblock": false, 00:14:36.660 "num_base_bdevs": 3, 00:14:36.660 "num_base_bdevs_discovered": 1, 00:14:36.660 "num_base_bdevs_operational": 3, 00:14:36.660 "base_bdevs_list": [ 00:14:36.660 { 00:14:36.660 "name": "BaseBdev1", 00:14:36.660 "uuid": "a844f751-0126-4158-a1d9-22ac7c8b71e2", 00:14:36.660 "is_configured": true, 00:14:36.660 "data_offset": 0, 00:14:36.660 "data_size": 65536 00:14:36.660 }, 00:14:36.660 { 00:14:36.660 "name": "BaseBdev2", 00:14:36.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.660 "is_configured": false, 00:14:36.660 "data_offset": 0, 00:14:36.660 "data_size": 0 00:14:36.660 }, 00:14:36.660 { 00:14:36.660 "name": "BaseBdev3", 00:14:36.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.660 "is_configured": false, 00:14:36.660 "data_offset": 0, 00:14:36.660 "data_size": 0 00:14:36.660 } 00:14:36.660 ] 00:14:36.660 }' 00:14:36.660 10:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.660 10:41:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.337 10:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:37.337 [2024-07-12 10:41:12.489225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:37.337 BaseBdev2 00:14:37.337 10:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:37.337 10:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:37.337 10:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:37.337 10:41:12 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:37.337 10:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:37.337 10:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:37.337 10:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:37.595 10:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:37.854 [ 00:14:37.854 { 00:14:37.854 "name": "BaseBdev2", 00:14:37.854 "aliases": [ 00:14:37.854 "c795fd4a-2335-4d44-b8db-f32889ea4d5a" 00:14:37.854 ], 00:14:37.854 "product_name": "Malloc disk", 00:14:37.854 "block_size": 512, 00:14:37.854 "num_blocks": 65536, 00:14:37.854 "uuid": "c795fd4a-2335-4d44-b8db-f32889ea4d5a", 00:14:37.854 "assigned_rate_limits": { 00:14:37.854 "rw_ios_per_sec": 0, 00:14:37.854 "rw_mbytes_per_sec": 0, 00:14:37.854 "r_mbytes_per_sec": 0, 00:14:37.854 "w_mbytes_per_sec": 0 00:14:37.854 }, 00:14:37.854 "claimed": true, 00:14:37.854 "claim_type": "exclusive_write", 00:14:37.854 "zoned": false, 00:14:37.854 "supported_io_types": { 00:14:37.854 "read": true, 00:14:37.854 "write": true, 00:14:37.854 "unmap": true, 00:14:37.854 "flush": true, 00:14:37.854 "reset": true, 00:14:37.854 "nvme_admin": false, 00:14:37.854 "nvme_io": false, 00:14:37.854 "nvme_io_md": false, 00:14:37.854 "write_zeroes": true, 00:14:37.854 "zcopy": true, 00:14:37.854 "get_zone_info": false, 00:14:37.854 "zone_management": false, 00:14:37.854 "zone_append": false, 00:14:37.854 "compare": false, 00:14:37.854 "compare_and_write": false, 00:14:37.854 "abort": true, 00:14:37.854 "seek_hole": false, 00:14:37.854 "seek_data": false, 00:14:37.854 "copy": true, 00:14:37.854 "nvme_iov_md": false 00:14:37.855 }, 00:14:37.855 "memory_domains": [ 00:14:37.855 { 00:14:37.855 "dma_device_id": "system", 00:14:37.855 "dma_device_type": 1 00:14:37.855 }, 00:14:37.855 { 00:14:37.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.855 "dma_device_type": 2 00:14:37.855 } 00:14:37.855 ], 00:14:37.855 "driver_specific": {} 00:14:37.855 } 00:14:37.855 ] 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.855 
10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.855 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.112 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.112 "name": "Existed_Raid", 00:14:38.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.112 "strip_size_kb": 64, 00:14:38.112 "state": "configuring", 00:14:38.112 "raid_level": "concat", 00:14:38.112 "superblock": false, 00:14:38.112 "num_base_bdevs": 3, 00:14:38.112 "num_base_bdevs_discovered": 2, 00:14:38.112 "num_base_bdevs_operational": 3, 00:14:38.112 "base_bdevs_list": [ 00:14:38.112 { 00:14:38.112 "name": "BaseBdev1", 00:14:38.112 "uuid": "a844f751-0126-4158-a1d9-22ac7c8b71e2", 00:14:38.112 "is_configured": true, 00:14:38.112 "data_offset": 0, 00:14:38.112 "data_size": 65536 00:14:38.112 }, 00:14:38.112 { 00:14:38.112 "name": "BaseBdev2", 00:14:38.112 "uuid": "c795fd4a-2335-4d44-b8db-f32889ea4d5a", 00:14:38.112 "is_configured": true, 00:14:38.112 "data_offset": 0, 00:14:38.112 "data_size": 65536 00:14:38.112 }, 00:14:38.112 { 00:14:38.112 "name": "BaseBdev3", 00:14:38.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.112 "is_configured": false, 00:14:38.112 "data_offset": 0, 00:14:38.112 "data_size": 0 00:14:38.112 } 00:14:38.112 ] 00:14:38.112 }' 00:14:38.112 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.112 10:41:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.692 10:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:38.949 [2024-07-12 10:41:14.088866] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:38.949 [2024-07-12 10:41:14.088902] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f74400 00:14:38.949 [2024-07-12 10:41:14.088911] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:38.949 [2024-07-12 10:41:14.089153] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f73ef0 00:14:38.949 [2024-07-12 10:41:14.089271] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f74400 00:14:38.949 [2024-07-12 10:41:14.089281] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f74400 00:14:38.949 [2024-07-12 10:41:14.089441] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:38.949 BaseBdev3 00:14:38.949 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:38.949 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:38.949 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:38.949 10:41:14 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:38.949 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:38.949 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:38.949 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.208 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:39.466 [ 00:14:39.466 { 00:14:39.466 "name": "BaseBdev3", 00:14:39.466 "aliases": [ 00:14:39.466 "aafaad0a-ac52-407e-b37f-b3cd096379a1" 00:14:39.466 ], 00:14:39.466 "product_name": "Malloc disk", 00:14:39.466 "block_size": 512, 00:14:39.466 "num_blocks": 65536, 00:14:39.466 "uuid": "aafaad0a-ac52-407e-b37f-b3cd096379a1", 00:14:39.466 "assigned_rate_limits": { 00:14:39.466 "rw_ios_per_sec": 0, 00:14:39.466 "rw_mbytes_per_sec": 0, 00:14:39.466 "r_mbytes_per_sec": 0, 00:14:39.466 "w_mbytes_per_sec": 0 00:14:39.466 }, 00:14:39.467 "claimed": true, 00:14:39.467 "claim_type": "exclusive_write", 00:14:39.467 "zoned": false, 00:14:39.467 "supported_io_types": { 00:14:39.467 "read": true, 00:14:39.467 "write": true, 00:14:39.467 "unmap": true, 00:14:39.467 "flush": true, 00:14:39.467 "reset": true, 00:14:39.467 "nvme_admin": false, 00:14:39.467 "nvme_io": false, 00:14:39.467 "nvme_io_md": false, 00:14:39.467 "write_zeroes": true, 00:14:39.467 "zcopy": true, 00:14:39.467 "get_zone_info": false, 00:14:39.467 "zone_management": false, 00:14:39.467 "zone_append": false, 00:14:39.467 "compare": false, 00:14:39.467 "compare_and_write": false, 00:14:39.467 "abort": true, 00:14:39.467 "seek_hole": false, 00:14:39.467 "seek_data": false, 00:14:39.467 "copy": true, 00:14:39.467 "nvme_iov_md": false 00:14:39.467 }, 00:14:39.467 "memory_domains": [ 00:14:39.467 { 00:14:39.467 "dma_device_id": "system", 00:14:39.467 "dma_device_type": 1 00:14:39.467 }, 00:14:39.467 { 00:14:39.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.467 "dma_device_type": 2 00:14:39.467 } 00:14:39.467 ], 00:14:39.467 "driver_specific": {} 00:14:39.467 } 00:14:39.467 ] 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.467 10:41:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.467 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.726 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.726 "name": "Existed_Raid", 00:14:39.726 "uuid": "08495080-845d-4de5-9cd5-50e92ee6ec30", 00:14:39.726 "strip_size_kb": 64, 00:14:39.726 "state": "online", 00:14:39.726 "raid_level": "concat", 00:14:39.726 "superblock": false, 00:14:39.726 "num_base_bdevs": 3, 00:14:39.726 "num_base_bdevs_discovered": 3, 00:14:39.726 "num_base_bdevs_operational": 3, 00:14:39.726 "base_bdevs_list": [ 00:14:39.726 { 00:14:39.726 "name": "BaseBdev1", 00:14:39.726 "uuid": "a844f751-0126-4158-a1d9-22ac7c8b71e2", 00:14:39.726 "is_configured": true, 00:14:39.726 "data_offset": 0, 00:14:39.726 "data_size": 65536 00:14:39.726 }, 00:14:39.726 { 00:14:39.726 "name": "BaseBdev2", 00:14:39.726 "uuid": "c795fd4a-2335-4d44-b8db-f32889ea4d5a", 00:14:39.726 "is_configured": true, 00:14:39.726 "data_offset": 0, 00:14:39.726 "data_size": 65536 00:14:39.726 }, 00:14:39.726 { 00:14:39.726 "name": "BaseBdev3", 00:14:39.726 "uuid": "aafaad0a-ac52-407e-b37f-b3cd096379a1", 00:14:39.726 "is_configured": true, 00:14:39.726 "data_offset": 0, 00:14:39.726 "data_size": 65536 00:14:39.726 } 00:14:39.726 ] 00:14:39.726 }' 00:14:39.726 10:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.726 10:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:40.295 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:40.554 [2024-07-12 10:41:15.581136] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:40.554 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:40.554 "name": "Existed_Raid", 00:14:40.554 "aliases": [ 00:14:40.554 "08495080-845d-4de5-9cd5-50e92ee6ec30" 00:14:40.554 ], 00:14:40.554 "product_name": "Raid Volume", 00:14:40.554 "block_size": 512, 00:14:40.554 "num_blocks": 196608, 00:14:40.554 "uuid": "08495080-845d-4de5-9cd5-50e92ee6ec30", 
00:14:40.554 "assigned_rate_limits": { 00:14:40.554 "rw_ios_per_sec": 0, 00:14:40.554 "rw_mbytes_per_sec": 0, 00:14:40.554 "r_mbytes_per_sec": 0, 00:14:40.554 "w_mbytes_per_sec": 0 00:14:40.554 }, 00:14:40.554 "claimed": false, 00:14:40.554 "zoned": false, 00:14:40.554 "supported_io_types": { 00:14:40.554 "read": true, 00:14:40.554 "write": true, 00:14:40.554 "unmap": true, 00:14:40.554 "flush": true, 00:14:40.554 "reset": true, 00:14:40.554 "nvme_admin": false, 00:14:40.554 "nvme_io": false, 00:14:40.554 "nvme_io_md": false, 00:14:40.554 "write_zeroes": true, 00:14:40.554 "zcopy": false, 00:14:40.554 "get_zone_info": false, 00:14:40.554 "zone_management": false, 00:14:40.554 "zone_append": false, 00:14:40.554 "compare": false, 00:14:40.554 "compare_and_write": false, 00:14:40.554 "abort": false, 00:14:40.554 "seek_hole": false, 00:14:40.554 "seek_data": false, 00:14:40.554 "copy": false, 00:14:40.554 "nvme_iov_md": false 00:14:40.554 }, 00:14:40.554 "memory_domains": [ 00:14:40.554 { 00:14:40.554 "dma_device_id": "system", 00:14:40.554 "dma_device_type": 1 00:14:40.554 }, 00:14:40.554 { 00:14:40.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.554 "dma_device_type": 2 00:14:40.554 }, 00:14:40.554 { 00:14:40.554 "dma_device_id": "system", 00:14:40.554 "dma_device_type": 1 00:14:40.554 }, 00:14:40.554 { 00:14:40.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.554 "dma_device_type": 2 00:14:40.554 }, 00:14:40.554 { 00:14:40.554 "dma_device_id": "system", 00:14:40.554 "dma_device_type": 1 00:14:40.554 }, 00:14:40.554 { 00:14:40.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.554 "dma_device_type": 2 00:14:40.554 } 00:14:40.554 ], 00:14:40.554 "driver_specific": { 00:14:40.554 "raid": { 00:14:40.554 "uuid": "08495080-845d-4de5-9cd5-50e92ee6ec30", 00:14:40.554 "strip_size_kb": 64, 00:14:40.554 "state": "online", 00:14:40.554 "raid_level": "concat", 00:14:40.554 "superblock": false, 00:14:40.554 "num_base_bdevs": 3, 00:14:40.554 "num_base_bdevs_discovered": 3, 00:14:40.554 "num_base_bdevs_operational": 3, 00:14:40.554 "base_bdevs_list": [ 00:14:40.554 { 00:14:40.554 "name": "BaseBdev1", 00:14:40.554 "uuid": "a844f751-0126-4158-a1d9-22ac7c8b71e2", 00:14:40.554 "is_configured": true, 00:14:40.554 "data_offset": 0, 00:14:40.554 "data_size": 65536 00:14:40.554 }, 00:14:40.554 { 00:14:40.554 "name": "BaseBdev2", 00:14:40.554 "uuid": "c795fd4a-2335-4d44-b8db-f32889ea4d5a", 00:14:40.554 "is_configured": true, 00:14:40.554 "data_offset": 0, 00:14:40.554 "data_size": 65536 00:14:40.554 }, 00:14:40.554 { 00:14:40.554 "name": "BaseBdev3", 00:14:40.554 "uuid": "aafaad0a-ac52-407e-b37f-b3cd096379a1", 00:14:40.554 "is_configured": true, 00:14:40.554 "data_offset": 0, 00:14:40.554 "data_size": 65536 00:14:40.554 } 00:14:40.554 ] 00:14:40.554 } 00:14:40.554 } 00:14:40.554 }' 00:14:40.554 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:40.554 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:40.554 BaseBdev2 00:14:40.554 BaseBdev3' 00:14:40.554 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.554 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:40.554 10:41:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.813 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.813 "name": "BaseBdev1", 00:14:40.813 "aliases": [ 00:14:40.813 "a844f751-0126-4158-a1d9-22ac7c8b71e2" 00:14:40.813 ], 00:14:40.813 "product_name": "Malloc disk", 00:14:40.813 "block_size": 512, 00:14:40.813 "num_blocks": 65536, 00:14:40.813 "uuid": "a844f751-0126-4158-a1d9-22ac7c8b71e2", 00:14:40.813 "assigned_rate_limits": { 00:14:40.813 "rw_ios_per_sec": 0, 00:14:40.813 "rw_mbytes_per_sec": 0, 00:14:40.813 "r_mbytes_per_sec": 0, 00:14:40.813 "w_mbytes_per_sec": 0 00:14:40.813 }, 00:14:40.813 "claimed": true, 00:14:40.813 "claim_type": "exclusive_write", 00:14:40.813 "zoned": false, 00:14:40.813 "supported_io_types": { 00:14:40.813 "read": true, 00:14:40.813 "write": true, 00:14:40.813 "unmap": true, 00:14:40.813 "flush": true, 00:14:40.813 "reset": true, 00:14:40.813 "nvme_admin": false, 00:14:40.813 "nvme_io": false, 00:14:40.813 "nvme_io_md": false, 00:14:40.813 "write_zeroes": true, 00:14:40.813 "zcopy": true, 00:14:40.813 "get_zone_info": false, 00:14:40.813 "zone_management": false, 00:14:40.813 "zone_append": false, 00:14:40.813 "compare": false, 00:14:40.813 "compare_and_write": false, 00:14:40.813 "abort": true, 00:14:40.813 "seek_hole": false, 00:14:40.813 "seek_data": false, 00:14:40.813 "copy": true, 00:14:40.813 "nvme_iov_md": false 00:14:40.813 }, 00:14:40.813 "memory_domains": [ 00:14:40.813 { 00:14:40.813 "dma_device_id": "system", 00:14:40.813 "dma_device_type": 1 00:14:40.813 }, 00:14:40.813 { 00:14:40.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.813 "dma_device_type": 2 00:14:40.813 } 00:14:40.813 ], 00:14:40.813 "driver_specific": {} 00:14:40.813 }' 00:14:40.813 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.813 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.813 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.813 10:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:41.072 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.330 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.330 "name": "BaseBdev2", 
00:14:41.330 "aliases": [ 00:14:41.330 "c795fd4a-2335-4d44-b8db-f32889ea4d5a" 00:14:41.330 ], 00:14:41.330 "product_name": "Malloc disk", 00:14:41.330 "block_size": 512, 00:14:41.330 "num_blocks": 65536, 00:14:41.330 "uuid": "c795fd4a-2335-4d44-b8db-f32889ea4d5a", 00:14:41.330 "assigned_rate_limits": { 00:14:41.330 "rw_ios_per_sec": 0, 00:14:41.330 "rw_mbytes_per_sec": 0, 00:14:41.330 "r_mbytes_per_sec": 0, 00:14:41.330 "w_mbytes_per_sec": 0 00:14:41.330 }, 00:14:41.330 "claimed": true, 00:14:41.330 "claim_type": "exclusive_write", 00:14:41.330 "zoned": false, 00:14:41.330 "supported_io_types": { 00:14:41.330 "read": true, 00:14:41.330 "write": true, 00:14:41.330 "unmap": true, 00:14:41.330 "flush": true, 00:14:41.331 "reset": true, 00:14:41.331 "nvme_admin": false, 00:14:41.331 "nvme_io": false, 00:14:41.331 "nvme_io_md": false, 00:14:41.331 "write_zeroes": true, 00:14:41.331 "zcopy": true, 00:14:41.331 "get_zone_info": false, 00:14:41.331 "zone_management": false, 00:14:41.331 "zone_append": false, 00:14:41.331 "compare": false, 00:14:41.331 "compare_and_write": false, 00:14:41.331 "abort": true, 00:14:41.331 "seek_hole": false, 00:14:41.331 "seek_data": false, 00:14:41.331 "copy": true, 00:14:41.331 "nvme_iov_md": false 00:14:41.331 }, 00:14:41.331 "memory_domains": [ 00:14:41.331 { 00:14:41.331 "dma_device_id": "system", 00:14:41.331 "dma_device_type": 1 00:14:41.331 }, 00:14:41.331 { 00:14:41.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.331 "dma_device_type": 2 00:14:41.331 } 00:14:41.331 ], 00:14:41.331 "driver_specific": {} 00:14:41.331 }' 00:14:41.331 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.589 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.848 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.848 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.848 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.848 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:41.848 10:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.107 "name": "BaseBdev3", 00:14:42.107 "aliases": [ 00:14:42.107 "aafaad0a-ac52-407e-b37f-b3cd096379a1" 00:14:42.107 ], 00:14:42.107 "product_name": "Malloc disk", 00:14:42.107 "block_size": 512, 
00:14:42.107 "num_blocks": 65536, 00:14:42.107 "uuid": "aafaad0a-ac52-407e-b37f-b3cd096379a1", 00:14:42.107 "assigned_rate_limits": { 00:14:42.107 "rw_ios_per_sec": 0, 00:14:42.107 "rw_mbytes_per_sec": 0, 00:14:42.107 "r_mbytes_per_sec": 0, 00:14:42.107 "w_mbytes_per_sec": 0 00:14:42.107 }, 00:14:42.107 "claimed": true, 00:14:42.107 "claim_type": "exclusive_write", 00:14:42.107 "zoned": false, 00:14:42.107 "supported_io_types": { 00:14:42.107 "read": true, 00:14:42.107 "write": true, 00:14:42.107 "unmap": true, 00:14:42.107 "flush": true, 00:14:42.107 "reset": true, 00:14:42.107 "nvme_admin": false, 00:14:42.107 "nvme_io": false, 00:14:42.107 "nvme_io_md": false, 00:14:42.107 "write_zeroes": true, 00:14:42.107 "zcopy": true, 00:14:42.107 "get_zone_info": false, 00:14:42.107 "zone_management": false, 00:14:42.107 "zone_append": false, 00:14:42.107 "compare": false, 00:14:42.107 "compare_and_write": false, 00:14:42.107 "abort": true, 00:14:42.107 "seek_hole": false, 00:14:42.107 "seek_data": false, 00:14:42.107 "copy": true, 00:14:42.107 "nvme_iov_md": false 00:14:42.107 }, 00:14:42.107 "memory_domains": [ 00:14:42.107 { 00:14:42.107 "dma_device_id": "system", 00:14:42.107 "dma_device_type": 1 00:14:42.107 }, 00:14:42.107 { 00:14:42.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.107 "dma_device_type": 2 00:14:42.107 } 00:14:42.107 ], 00:14:42.107 "driver_specific": {} 00:14:42.107 }' 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.107 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.366 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.366 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.366 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.366 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.366 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.366 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:42.625 [2024-07-12 10:41:17.602272] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:42.625 [2024-07-12 10:41:17.602297] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:42.625 [2024-07-12 10:41:17.602337] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:42.625 10:41:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.625 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.884 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.884 "name": "Existed_Raid", 00:14:42.884 "uuid": "08495080-845d-4de5-9cd5-50e92ee6ec30", 00:14:42.884 "strip_size_kb": 64, 00:14:42.884 "state": "offline", 00:14:42.884 "raid_level": "concat", 00:14:42.884 "superblock": false, 00:14:42.884 "num_base_bdevs": 3, 00:14:42.884 "num_base_bdevs_discovered": 2, 00:14:42.884 "num_base_bdevs_operational": 2, 00:14:42.884 "base_bdevs_list": [ 00:14:42.884 { 00:14:42.884 "name": null, 00:14:42.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.884 "is_configured": false, 00:14:42.884 "data_offset": 0, 00:14:42.884 "data_size": 65536 00:14:42.884 }, 00:14:42.884 { 00:14:42.884 "name": "BaseBdev2", 00:14:42.884 "uuid": "c795fd4a-2335-4d44-b8db-f32889ea4d5a", 00:14:42.884 "is_configured": true, 00:14:42.884 "data_offset": 0, 00:14:42.884 "data_size": 65536 00:14:42.884 }, 00:14:42.884 { 00:14:42.884 "name": "BaseBdev3", 00:14:42.884 "uuid": "aafaad0a-ac52-407e-b37f-b3cd096379a1", 00:14:42.884 "is_configured": true, 00:14:42.884 "data_offset": 0, 00:14:42.884 "data_size": 65536 00:14:42.884 } 00:14:42.884 ] 00:14:42.884 }' 00:14:42.884 10:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.884 10:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.451 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:43.451 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.451 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.451 10:41:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:43.710 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:43.710 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:43.710 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:43.969 [2024-07-12 10:41:18.930838] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:43.969 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:43.969 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.969 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.969 10:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:44.228 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:44.228 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:44.228 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:44.487 [2024-07-12 10:41:19.424599] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:44.487 [2024-07-12 10:41:19.424644] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f74400 name Existed_Raid, state offline 00:14:44.487 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:44.487 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:44.487 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.487 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:44.746 BaseBdev2 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
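After each base bdev is removed and re-created as a fresh malloc bdev, the test gates on the waitforbdev helper from autotest_common.sh. Only its xtrace output is visible here, but the traced steps are: default the timeout to 2000 ms, flush pending examine callbacks with bdev_wait_for_examine, then confirm registration with bdev_get_bdevs -b NAME -t 2000. A rough, simplified reconstruction from that trace (the real helper is more elaborate, e.g. a retry counter and a caller-supplied timeout):

# Simplified sketch of the waitforbdev pattern visible in the trace above;
# not the actual autotest_common.sh implementation.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

waitforbdev_sketch() {
    local bdev_name=$1
    local bdev_timeout=${2:-2000}   # the trace shows the default being filled in as 2000 ms

    # Let outstanding examine callbacks finish before querying the bdev layer.
    $rpc bdev_wait_for_examine

    # bdev_get_bdevs -t waits up to the given number of milliseconds for the
    # bdev to appear and fails otherwise.
    $rpc bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout" > /dev/null
}

# Used after each bdev_malloc_create, e.g.:
#   $rpc bdev_malloc_create 32 512 -b BaseBdev2
#   waitforbdev_sketch BaseBdev2
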
00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:44.746 10:41:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.005 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:45.264 [ 00:14:45.264 { 00:14:45.264 "name": "BaseBdev2", 00:14:45.264 "aliases": [ 00:14:45.264 "853043df-6978-485e-b8e7-dce1a8e2f764" 00:14:45.264 ], 00:14:45.264 "product_name": "Malloc disk", 00:14:45.264 "block_size": 512, 00:14:45.264 "num_blocks": 65536, 00:14:45.264 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:45.264 "assigned_rate_limits": { 00:14:45.264 "rw_ios_per_sec": 0, 00:14:45.264 "rw_mbytes_per_sec": 0, 00:14:45.264 "r_mbytes_per_sec": 0, 00:14:45.264 "w_mbytes_per_sec": 0 00:14:45.264 }, 00:14:45.264 "claimed": false, 00:14:45.264 "zoned": false, 00:14:45.264 "supported_io_types": { 00:14:45.264 "read": true, 00:14:45.264 "write": true, 00:14:45.264 "unmap": true, 00:14:45.264 "flush": true, 00:14:45.264 "reset": true, 00:14:45.264 "nvme_admin": false, 00:14:45.264 "nvme_io": false, 00:14:45.264 "nvme_io_md": false, 00:14:45.264 "write_zeroes": true, 00:14:45.264 "zcopy": true, 00:14:45.264 "get_zone_info": false, 00:14:45.264 "zone_management": false, 00:14:45.264 "zone_append": false, 00:14:45.264 "compare": false, 00:14:45.264 "compare_and_write": false, 00:14:45.264 "abort": true, 00:14:45.264 "seek_hole": false, 00:14:45.264 "seek_data": false, 00:14:45.264 "copy": true, 00:14:45.264 "nvme_iov_md": false 00:14:45.264 }, 00:14:45.264 "memory_domains": [ 00:14:45.264 { 00:14:45.264 "dma_device_id": "system", 00:14:45.264 "dma_device_type": 1 00:14:45.264 }, 00:14:45.264 { 00:14:45.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.264 "dma_device_type": 2 00:14:45.264 } 00:14:45.264 ], 00:14:45.264 "driver_specific": {} 00:14:45.264 } 00:14:45.264 ] 00:14:45.264 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:45.264 10:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:45.264 10:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:45.264 10:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:45.524 BaseBdev3 00:14:45.524 10:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:45.524 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:45.524 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:45.524 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:45.524 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:45.524 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:45.524 10:41:20 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.783 10:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:46.042 [ 00:14:46.042 { 00:14:46.042 "name": "BaseBdev3", 00:14:46.042 "aliases": [ 00:14:46.042 "9b02dc69-5d98-4983-96bf-4b98190b83cc" 00:14:46.042 ], 00:14:46.042 "product_name": "Malloc disk", 00:14:46.042 "block_size": 512, 00:14:46.042 "num_blocks": 65536, 00:14:46.042 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:46.042 "assigned_rate_limits": { 00:14:46.042 "rw_ios_per_sec": 0, 00:14:46.042 "rw_mbytes_per_sec": 0, 00:14:46.042 "r_mbytes_per_sec": 0, 00:14:46.042 "w_mbytes_per_sec": 0 00:14:46.042 }, 00:14:46.042 "claimed": false, 00:14:46.042 "zoned": false, 00:14:46.042 "supported_io_types": { 00:14:46.042 "read": true, 00:14:46.042 "write": true, 00:14:46.042 "unmap": true, 00:14:46.042 "flush": true, 00:14:46.042 "reset": true, 00:14:46.042 "nvme_admin": false, 00:14:46.042 "nvme_io": false, 00:14:46.042 "nvme_io_md": false, 00:14:46.042 "write_zeroes": true, 00:14:46.042 "zcopy": true, 00:14:46.042 "get_zone_info": false, 00:14:46.042 "zone_management": false, 00:14:46.042 "zone_append": false, 00:14:46.042 "compare": false, 00:14:46.042 "compare_and_write": false, 00:14:46.042 "abort": true, 00:14:46.042 "seek_hole": false, 00:14:46.042 "seek_data": false, 00:14:46.042 "copy": true, 00:14:46.042 "nvme_iov_md": false 00:14:46.042 }, 00:14:46.042 "memory_domains": [ 00:14:46.042 { 00:14:46.042 "dma_device_id": "system", 00:14:46.042 "dma_device_type": 1 00:14:46.042 }, 00:14:46.042 { 00:14:46.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.042 "dma_device_type": 2 00:14:46.042 } 00:14:46.042 ], 00:14:46.042 "driver_specific": {} 00:14:46.042 } 00:14:46.042 ] 00:14:46.042 10:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:46.042 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:46.042 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:46.043 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:46.302 [2024-07-12 10:41:21.387759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:46.302 [2024-07-12 10:41:21.387801] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:46.302 [2024-07-12 10:41:21.387820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:46.302 [2024-07-12 10:41:21.389135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.302 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.561 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.561 "name": "Existed_Raid", 00:14:46.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.561 "strip_size_kb": 64, 00:14:46.561 "state": "configuring", 00:14:46.561 "raid_level": "concat", 00:14:46.561 "superblock": false, 00:14:46.561 "num_base_bdevs": 3, 00:14:46.561 "num_base_bdevs_discovered": 2, 00:14:46.561 "num_base_bdevs_operational": 3, 00:14:46.561 "base_bdevs_list": [ 00:14:46.561 { 00:14:46.561 "name": "BaseBdev1", 00:14:46.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.561 "is_configured": false, 00:14:46.561 "data_offset": 0, 00:14:46.561 "data_size": 0 00:14:46.561 }, 00:14:46.561 { 00:14:46.561 "name": "BaseBdev2", 00:14:46.561 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:46.561 "is_configured": true, 00:14:46.561 "data_offset": 0, 00:14:46.561 "data_size": 65536 00:14:46.561 }, 00:14:46.561 { 00:14:46.561 "name": "BaseBdev3", 00:14:46.561 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:46.561 "is_configured": true, 00:14:46.561 "data_offset": 0, 00:14:46.561 "data_size": 65536 00:14:46.561 } 00:14:46.561 ] 00:14:46.561 }' 00:14:46.561 10:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.561 10:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.128 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:47.387 [2024-07-12 10:41:22.446544] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.387 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.646 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.646 "name": "Existed_Raid", 00:14:47.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.646 "strip_size_kb": 64, 00:14:47.646 "state": "configuring", 00:14:47.646 "raid_level": "concat", 00:14:47.646 "superblock": false, 00:14:47.646 "num_base_bdevs": 3, 00:14:47.646 "num_base_bdevs_discovered": 1, 00:14:47.646 "num_base_bdevs_operational": 3, 00:14:47.646 "base_bdevs_list": [ 00:14:47.646 { 00:14:47.646 "name": "BaseBdev1", 00:14:47.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.646 "is_configured": false, 00:14:47.646 "data_offset": 0, 00:14:47.646 "data_size": 0 00:14:47.646 }, 00:14:47.646 { 00:14:47.646 "name": null, 00:14:47.646 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:47.646 "is_configured": false, 00:14:47.646 "data_offset": 0, 00:14:47.646 "data_size": 65536 00:14:47.646 }, 00:14:47.646 { 00:14:47.646 "name": "BaseBdev3", 00:14:47.646 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:47.646 "is_configured": true, 00:14:47.646 "data_offset": 0, 00:14:47.646 "data_size": 65536 00:14:47.646 } 00:14:47.646 ] 00:14:47.646 }' 00:14:47.646 10:41:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.646 10:41:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.213 10:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.213 10:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:48.471 10:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:48.471 10:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:48.730 [2024-07-12 10:41:23.798673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:48.730 BaseBdev1 00:14:48.730 10:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:48.730 10:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:48.730 10:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.730 10:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:48.730 10:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.730 10:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:48.730 10:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.989 10:41:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:49.248 [ 00:14:49.248 { 00:14:49.248 "name": "BaseBdev1", 00:14:49.248 "aliases": [ 00:14:49.248 "13121472-9e05-4dd9-b8b4-2078389128e2" 00:14:49.248 ], 00:14:49.248 "product_name": "Malloc disk", 00:14:49.248 "block_size": 512, 00:14:49.248 "num_blocks": 65536, 00:14:49.248 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:49.248 "assigned_rate_limits": { 00:14:49.248 "rw_ios_per_sec": 0, 00:14:49.248 "rw_mbytes_per_sec": 0, 00:14:49.248 "r_mbytes_per_sec": 0, 00:14:49.248 "w_mbytes_per_sec": 0 00:14:49.248 }, 00:14:49.248 "claimed": true, 00:14:49.248 "claim_type": "exclusive_write", 00:14:49.248 "zoned": false, 00:14:49.248 "supported_io_types": { 00:14:49.248 "read": true, 00:14:49.248 "write": true, 00:14:49.248 "unmap": true, 00:14:49.248 "flush": true, 00:14:49.248 "reset": true, 00:14:49.248 "nvme_admin": false, 00:14:49.248 "nvme_io": false, 00:14:49.248 "nvme_io_md": false, 00:14:49.248 "write_zeroes": true, 00:14:49.248 "zcopy": true, 00:14:49.248 "get_zone_info": false, 00:14:49.248 "zone_management": false, 00:14:49.248 "zone_append": false, 00:14:49.248 "compare": false, 00:14:49.248 "compare_and_write": false, 00:14:49.248 "abort": true, 00:14:49.248 "seek_hole": false, 00:14:49.248 "seek_data": false, 00:14:49.248 "copy": true, 00:14:49.248 "nvme_iov_md": false 00:14:49.248 }, 00:14:49.248 "memory_domains": [ 00:14:49.248 { 00:14:49.248 "dma_device_id": "system", 00:14:49.248 "dma_device_type": 1 00:14:49.248 }, 00:14:49.248 { 00:14:49.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.248 "dma_device_type": 2 00:14:49.248 } 00:14:49.248 ], 00:14:49.248 "driver_specific": {} 00:14:49.248 } 00:14:49.248 ] 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:49.248 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.507 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.507 "name": "Existed_Raid", 00:14:49.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.507 "strip_size_kb": 64, 00:14:49.507 "state": "configuring", 00:14:49.507 "raid_level": "concat", 00:14:49.507 "superblock": false, 00:14:49.507 "num_base_bdevs": 3, 00:14:49.507 "num_base_bdevs_discovered": 2, 00:14:49.507 "num_base_bdevs_operational": 3, 00:14:49.507 "base_bdevs_list": [ 00:14:49.507 { 00:14:49.507 "name": "BaseBdev1", 00:14:49.507 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:49.507 "is_configured": true, 00:14:49.507 "data_offset": 0, 00:14:49.507 "data_size": 65536 00:14:49.507 }, 00:14:49.507 { 00:14:49.507 "name": null, 00:14:49.507 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:49.507 "is_configured": false, 00:14:49.507 "data_offset": 0, 00:14:49.507 "data_size": 65536 00:14:49.507 }, 00:14:49.507 { 00:14:49.507 "name": "BaseBdev3", 00:14:49.507 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:49.507 "is_configured": true, 00:14:49.507 "data_offset": 0, 00:14:49.507 "data_size": 65536 00:14:49.507 } 00:14:49.507 ] 00:14:49.507 }' 00:14:49.507 10:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.507 10:41:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.074 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.074 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:50.332 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:50.332 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:50.589 [2024-07-12 10:41:25.599515] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:50.589 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:50.589 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.590 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.847 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.847 "name": "Existed_Raid", 00:14:50.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.847 "strip_size_kb": 64, 00:14:50.847 "state": "configuring", 00:14:50.847 "raid_level": "concat", 00:14:50.847 "superblock": false, 00:14:50.847 "num_base_bdevs": 3, 00:14:50.847 "num_base_bdevs_discovered": 1, 00:14:50.847 "num_base_bdevs_operational": 3, 00:14:50.847 "base_bdevs_list": [ 00:14:50.847 { 00:14:50.847 "name": "BaseBdev1", 00:14:50.847 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:50.847 "is_configured": true, 00:14:50.847 "data_offset": 0, 00:14:50.847 "data_size": 65536 00:14:50.847 }, 00:14:50.847 { 00:14:50.848 "name": null, 00:14:50.848 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:50.848 "is_configured": false, 00:14:50.848 "data_offset": 0, 00:14:50.848 "data_size": 65536 00:14:50.848 }, 00:14:50.848 { 00:14:50.848 "name": null, 00:14:50.848 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:50.848 "is_configured": false, 00:14:50.848 "data_offset": 0, 00:14:50.848 "data_size": 65536 00:14:50.848 } 00:14:50.848 ] 00:14:50.848 }' 00:14:50.848 10:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.848 10:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.413 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.413 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:51.671 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:51.671 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:51.930 [2024-07-12 10:41:26.919053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.930 10:41:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.930 10:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.189 10:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.189 "name": "Existed_Raid", 00:14:52.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.189 "strip_size_kb": 64, 00:14:52.189 "state": "configuring", 00:14:52.189 "raid_level": "concat", 00:14:52.189 "superblock": false, 00:14:52.189 "num_base_bdevs": 3, 00:14:52.189 "num_base_bdevs_discovered": 2, 00:14:52.189 "num_base_bdevs_operational": 3, 00:14:52.189 "base_bdevs_list": [ 00:14:52.189 { 00:14:52.189 "name": "BaseBdev1", 00:14:52.189 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:52.189 "is_configured": true, 00:14:52.189 "data_offset": 0, 00:14:52.189 "data_size": 65536 00:14:52.189 }, 00:14:52.189 { 00:14:52.189 "name": null, 00:14:52.189 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:52.189 "is_configured": false, 00:14:52.189 "data_offset": 0, 00:14:52.189 "data_size": 65536 00:14:52.189 }, 00:14:52.189 { 00:14:52.189 "name": "BaseBdev3", 00:14:52.189 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:52.189 "is_configured": true, 00:14:52.189 "data_offset": 0, 00:14:52.189 "data_size": 65536 00:14:52.189 } 00:14:52.189 ] 00:14:52.189 }' 00:14:52.189 10:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.189 10:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.754 10:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.754 10:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:53.012 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:53.012 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:53.268 [2024-07-12 10:41:28.250636] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.268 10:41:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.268 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.526 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.526 "name": "Existed_Raid", 00:14:53.526 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.526 "strip_size_kb": 64, 00:14:53.526 "state": "configuring", 00:14:53.526 "raid_level": "concat", 00:14:53.526 "superblock": false, 00:14:53.526 "num_base_bdevs": 3, 00:14:53.526 "num_base_bdevs_discovered": 1, 00:14:53.526 "num_base_bdevs_operational": 3, 00:14:53.526 "base_bdevs_list": [ 00:14:53.526 { 00:14:53.526 "name": null, 00:14:53.526 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:53.526 "is_configured": false, 00:14:53.526 "data_offset": 0, 00:14:53.526 "data_size": 65536 00:14:53.526 }, 00:14:53.526 { 00:14:53.526 "name": null, 00:14:53.526 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:53.526 "is_configured": false, 00:14:53.526 "data_offset": 0, 00:14:53.526 "data_size": 65536 00:14:53.526 }, 00:14:53.526 { 00:14:53.526 "name": "BaseBdev3", 00:14:53.526 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:53.526 "is_configured": true, 00:14:53.526 "data_offset": 0, 00:14:53.526 "data_size": 65536 00:14:53.526 } 00:14:53.526 ] 00:14:53.526 }' 00:14:53.526 10:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.526 10:41:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.458 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.458 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:54.458 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:54.458 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:54.716 [2024-07-12 10:41:29.817552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.716 10:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.975 10:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.975 "name": "Existed_Raid", 00:14:54.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.975 "strip_size_kb": 64, 00:14:54.975 "state": "configuring", 00:14:54.975 "raid_level": "concat", 00:14:54.975 "superblock": false, 00:14:54.975 "num_base_bdevs": 3, 00:14:54.975 "num_base_bdevs_discovered": 2, 00:14:54.975 "num_base_bdevs_operational": 3, 00:14:54.975 "base_bdevs_list": [ 00:14:54.975 { 00:14:54.975 "name": null, 00:14:54.975 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:54.975 "is_configured": false, 00:14:54.975 "data_offset": 0, 00:14:54.975 "data_size": 65536 00:14:54.975 }, 00:14:54.975 { 00:14:54.975 "name": "BaseBdev2", 00:14:54.975 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:54.975 "is_configured": true, 00:14:54.975 "data_offset": 0, 00:14:54.975 "data_size": 65536 00:14:54.975 }, 00:14:54.975 { 00:14:54.975 "name": "BaseBdev3", 00:14:54.975 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:54.975 "is_configured": true, 00:14:54.975 "data_offset": 0, 00:14:54.975 "data_size": 65536 00:14:54.975 } 00:14:54.975 ] 00:14:54.975 }' 00:14:54.975 10:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.975 10:41:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.542 10:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.542 10:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:55.801 10:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:55.801 10:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.801 10:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:56.059 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 13121472-9e05-4dd9-b8b4-2078389128e2 00:14:56.317 [2024-07-12 10:41:31.341066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:56.317 [2024-07-12 10:41:31.341102] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f72450 00:14:56.317 [2024-07-12 10:41:31.341111] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:56.317 [2024-07-12 10:41:31.341296] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f73ed0 00:14:56.317 [2024-07-12 10:41:31.341406] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f72450 00:14:56.317 [2024-07-12 10:41:31.341416] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f72450 00:14:56.317 [2024-07-12 10:41:31.341594] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:56.317 NewBaseBdev 00:14:56.317 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:56.317 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:56.317 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:56.317 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:56.317 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:56.317 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:56.317 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:56.575 [ 00:14:56.575 { 00:14:56.575 "name": "NewBaseBdev", 00:14:56.575 "aliases": [ 00:14:56.575 "13121472-9e05-4dd9-b8b4-2078389128e2" 00:14:56.575 ], 00:14:56.575 "product_name": "Malloc disk", 00:14:56.575 "block_size": 512, 00:14:56.575 "num_blocks": 65536, 00:14:56.575 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:56.575 "assigned_rate_limits": { 00:14:56.575 "rw_ios_per_sec": 0, 00:14:56.575 "rw_mbytes_per_sec": 0, 00:14:56.575 "r_mbytes_per_sec": 0, 00:14:56.575 "w_mbytes_per_sec": 0 00:14:56.575 }, 00:14:56.575 "claimed": true, 00:14:56.575 "claim_type": "exclusive_write", 00:14:56.575 "zoned": false, 00:14:56.575 "supported_io_types": { 00:14:56.575 "read": true, 00:14:56.575 "write": true, 00:14:56.575 "unmap": true, 00:14:56.575 "flush": true, 00:14:56.575 "reset": true, 00:14:56.575 "nvme_admin": false, 00:14:56.575 "nvme_io": false, 00:14:56.575 "nvme_io_md": false, 00:14:56.575 "write_zeroes": true, 00:14:56.575 "zcopy": true, 00:14:56.575 "get_zone_info": false, 00:14:56.575 "zone_management": false, 00:14:56.575 "zone_append": false, 00:14:56.575 "compare": false, 00:14:56.575 "compare_and_write": false, 00:14:56.575 "abort": true, 00:14:56.575 "seek_hole": false, 00:14:56.575 "seek_data": false, 00:14:56.575 "copy": true, 00:14:56.575 "nvme_iov_md": false 00:14:56.575 }, 00:14:56.575 "memory_domains": [ 00:14:56.575 { 00:14:56.575 "dma_device_id": "system", 00:14:56.575 "dma_device_type": 1 00:14:56.575 }, 00:14:56.575 { 00:14:56.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.575 "dma_device_type": 2 00:14:56.575 } 00:14:56.575 ], 00:14:56.575 "driver_specific": {} 00:14:56.575 } 00:14:56.575 ] 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.575 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.834 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.834 "name": "Existed_Raid", 00:14:56.834 "uuid": "e67fabd9-80bc-4937-9d6c-7309f6108635", 00:14:56.834 "strip_size_kb": 64, 00:14:56.834 "state": "online", 00:14:56.834 "raid_level": "concat", 00:14:56.834 "superblock": false, 00:14:56.834 "num_base_bdevs": 3, 00:14:56.834 "num_base_bdevs_discovered": 3, 00:14:56.834 "num_base_bdevs_operational": 3, 00:14:56.834 "base_bdevs_list": [ 00:14:56.834 { 00:14:56.834 "name": "NewBaseBdev", 00:14:56.834 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:56.834 "is_configured": true, 00:14:56.834 "data_offset": 0, 00:14:56.834 "data_size": 65536 00:14:56.834 }, 00:14:56.834 { 00:14:56.834 "name": "BaseBdev2", 00:14:56.834 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:56.834 "is_configured": true, 00:14:56.834 "data_offset": 0, 00:14:56.834 "data_size": 65536 00:14:56.834 }, 00:14:56.834 { 00:14:56.834 "name": "BaseBdev3", 00:14:56.834 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:56.834 "is_configured": true, 00:14:56.834 "data_offset": 0, 00:14:56.834 "data_size": 65536 00:14:56.834 } 00:14:56.834 ] 00:14:56.834 }' 00:14:56.835 10:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.835 10:41:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.400 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:57.400 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:57.400 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:57.400 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:57.400 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:57.400 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:57.658 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b Existed_Raid 00:14:57.658 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:57.658 [2024-07-12 10:41:32.821443] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:57.658 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:57.658 "name": "Existed_Raid", 00:14:57.658 "aliases": [ 00:14:57.658 "e67fabd9-80bc-4937-9d6c-7309f6108635" 00:14:57.658 ], 00:14:57.658 "product_name": "Raid Volume", 00:14:57.658 "block_size": 512, 00:14:57.658 "num_blocks": 196608, 00:14:57.658 "uuid": "e67fabd9-80bc-4937-9d6c-7309f6108635", 00:14:57.658 "assigned_rate_limits": { 00:14:57.658 "rw_ios_per_sec": 0, 00:14:57.658 "rw_mbytes_per_sec": 0, 00:14:57.658 "r_mbytes_per_sec": 0, 00:14:57.658 "w_mbytes_per_sec": 0 00:14:57.658 }, 00:14:57.658 "claimed": false, 00:14:57.658 "zoned": false, 00:14:57.658 "supported_io_types": { 00:14:57.658 "read": true, 00:14:57.658 "write": true, 00:14:57.658 "unmap": true, 00:14:57.658 "flush": true, 00:14:57.658 "reset": true, 00:14:57.658 "nvme_admin": false, 00:14:57.658 "nvme_io": false, 00:14:57.658 "nvme_io_md": false, 00:14:57.658 "write_zeroes": true, 00:14:57.658 "zcopy": false, 00:14:57.658 "get_zone_info": false, 00:14:57.658 "zone_management": false, 00:14:57.658 "zone_append": false, 00:14:57.658 "compare": false, 00:14:57.658 "compare_and_write": false, 00:14:57.658 "abort": false, 00:14:57.658 "seek_hole": false, 00:14:57.658 "seek_data": false, 00:14:57.658 "copy": false, 00:14:57.658 "nvme_iov_md": false 00:14:57.658 }, 00:14:57.658 "memory_domains": [ 00:14:57.658 { 00:14:57.658 "dma_device_id": "system", 00:14:57.658 "dma_device_type": 1 00:14:57.658 }, 00:14:57.658 { 00:14:57.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.658 "dma_device_type": 2 00:14:57.658 }, 00:14:57.658 { 00:14:57.658 "dma_device_id": "system", 00:14:57.658 "dma_device_type": 1 00:14:57.658 }, 00:14:57.658 { 00:14:57.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.658 "dma_device_type": 2 00:14:57.658 }, 00:14:57.658 { 00:14:57.658 "dma_device_id": "system", 00:14:57.658 "dma_device_type": 1 00:14:57.658 }, 00:14:57.658 { 00:14:57.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.658 "dma_device_type": 2 00:14:57.658 } 00:14:57.658 ], 00:14:57.658 "driver_specific": { 00:14:57.658 "raid": { 00:14:57.658 "uuid": "e67fabd9-80bc-4937-9d6c-7309f6108635", 00:14:57.658 "strip_size_kb": 64, 00:14:57.658 "state": "online", 00:14:57.658 "raid_level": "concat", 00:14:57.658 "superblock": false, 00:14:57.658 "num_base_bdevs": 3, 00:14:57.658 "num_base_bdevs_discovered": 3, 00:14:57.658 "num_base_bdevs_operational": 3, 00:14:57.658 "base_bdevs_list": [ 00:14:57.658 { 00:14:57.658 "name": "NewBaseBdev", 00:14:57.658 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:57.658 "is_configured": true, 00:14:57.658 "data_offset": 0, 00:14:57.658 "data_size": 65536 00:14:57.658 }, 00:14:57.658 { 00:14:57.658 "name": "BaseBdev2", 00:14:57.658 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:57.658 "is_configured": true, 00:14:57.658 "data_offset": 0, 00:14:57.658 "data_size": 65536 00:14:57.658 }, 00:14:57.658 { 00:14:57.658 "name": "BaseBdev3", 00:14:57.658 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:57.658 "is_configured": true, 00:14:57.658 "data_offset": 0, 00:14:57.659 "data_size": 65536 00:14:57.659 } 00:14:57.659 ] 00:14:57.659 } 00:14:57.659 } 00:14:57.659 }' 00:14:57.659 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:57.917 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:57.917 BaseBdev2 00:14:57.917 BaseBdev3' 00:14:57.917 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.917 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:57.917 10:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.174 "name": "NewBaseBdev", 00:14:58.174 "aliases": [ 00:14:58.174 "13121472-9e05-4dd9-b8b4-2078389128e2" 00:14:58.174 ], 00:14:58.174 "product_name": "Malloc disk", 00:14:58.174 "block_size": 512, 00:14:58.174 "num_blocks": 65536, 00:14:58.174 "uuid": "13121472-9e05-4dd9-b8b4-2078389128e2", 00:14:58.174 "assigned_rate_limits": { 00:14:58.174 "rw_ios_per_sec": 0, 00:14:58.174 "rw_mbytes_per_sec": 0, 00:14:58.174 "r_mbytes_per_sec": 0, 00:14:58.174 "w_mbytes_per_sec": 0 00:14:58.174 }, 00:14:58.174 "claimed": true, 00:14:58.174 "claim_type": "exclusive_write", 00:14:58.174 "zoned": false, 00:14:58.174 "supported_io_types": { 00:14:58.174 "read": true, 00:14:58.174 "write": true, 00:14:58.174 "unmap": true, 00:14:58.174 "flush": true, 00:14:58.174 "reset": true, 00:14:58.174 "nvme_admin": false, 00:14:58.174 "nvme_io": false, 00:14:58.174 "nvme_io_md": false, 00:14:58.174 "write_zeroes": true, 00:14:58.174 "zcopy": true, 00:14:58.174 "get_zone_info": false, 00:14:58.174 "zone_management": false, 00:14:58.174 "zone_append": false, 00:14:58.174 "compare": false, 00:14:58.174 "compare_and_write": false, 00:14:58.174 "abort": true, 00:14:58.174 "seek_hole": false, 00:14:58.174 "seek_data": false, 00:14:58.174 "copy": true, 00:14:58.174 "nvme_iov_md": false 00:14:58.174 }, 00:14:58.174 "memory_domains": [ 00:14:58.174 { 00:14:58.174 "dma_device_id": "system", 00:14:58.174 "dma_device_type": 1 00:14:58.174 }, 00:14:58.174 { 00:14:58.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.174 "dma_device_type": 2 00:14:58.174 } 00:14:58.174 ], 00:14:58.174 "driver_specific": {} 00:14:58.174 }' 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.174 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.432 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.432 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.432 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.432 10:41:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.432 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.432 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:58.432 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.690 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.690 "name": "BaseBdev2", 00:14:58.690 "aliases": [ 00:14:58.690 "853043df-6978-485e-b8e7-dce1a8e2f764" 00:14:58.690 ], 00:14:58.690 "product_name": "Malloc disk", 00:14:58.690 "block_size": 512, 00:14:58.690 "num_blocks": 65536, 00:14:58.690 "uuid": "853043df-6978-485e-b8e7-dce1a8e2f764", 00:14:58.690 "assigned_rate_limits": { 00:14:58.690 "rw_ios_per_sec": 0, 00:14:58.690 "rw_mbytes_per_sec": 0, 00:14:58.690 "r_mbytes_per_sec": 0, 00:14:58.690 "w_mbytes_per_sec": 0 00:14:58.690 }, 00:14:58.690 "claimed": true, 00:14:58.690 "claim_type": "exclusive_write", 00:14:58.690 "zoned": false, 00:14:58.690 "supported_io_types": { 00:14:58.690 "read": true, 00:14:58.690 "write": true, 00:14:58.690 "unmap": true, 00:14:58.690 "flush": true, 00:14:58.690 "reset": true, 00:14:58.690 "nvme_admin": false, 00:14:58.690 "nvme_io": false, 00:14:58.690 "nvme_io_md": false, 00:14:58.690 "write_zeroes": true, 00:14:58.690 "zcopy": true, 00:14:58.690 "get_zone_info": false, 00:14:58.690 "zone_management": false, 00:14:58.690 "zone_append": false, 00:14:58.690 "compare": false, 00:14:58.690 "compare_and_write": false, 00:14:58.690 "abort": true, 00:14:58.690 "seek_hole": false, 00:14:58.690 "seek_data": false, 00:14:58.690 "copy": true, 00:14:58.690 "nvme_iov_md": false 00:14:58.690 }, 00:14:58.690 "memory_domains": [ 00:14:58.690 { 00:14:58.690 "dma_device_id": "system", 00:14:58.690 "dma_device_type": 1 00:14:58.690 }, 00:14:58.690 { 00:14:58.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.690 "dma_device_type": 2 00:14:58.690 } 00:14:58.690 ], 00:14:58.690 "driver_specific": {} 00:14:58.690 }' 00:14:58.690 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.690 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.690 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.690 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.690 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.947 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.947 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.947 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.947 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.947 10:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.947 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.947 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.947 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:14:58.947 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:58.947 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:59.204 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:59.204 "name": "BaseBdev3", 00:14:59.204 "aliases": [ 00:14:59.204 "9b02dc69-5d98-4983-96bf-4b98190b83cc" 00:14:59.204 ], 00:14:59.204 "product_name": "Malloc disk", 00:14:59.204 "block_size": 512, 00:14:59.204 "num_blocks": 65536, 00:14:59.204 "uuid": "9b02dc69-5d98-4983-96bf-4b98190b83cc", 00:14:59.204 "assigned_rate_limits": { 00:14:59.204 "rw_ios_per_sec": 0, 00:14:59.204 "rw_mbytes_per_sec": 0, 00:14:59.204 "r_mbytes_per_sec": 0, 00:14:59.204 "w_mbytes_per_sec": 0 00:14:59.204 }, 00:14:59.204 "claimed": true, 00:14:59.204 "claim_type": "exclusive_write", 00:14:59.204 "zoned": false, 00:14:59.204 "supported_io_types": { 00:14:59.204 "read": true, 00:14:59.204 "write": true, 00:14:59.204 "unmap": true, 00:14:59.204 "flush": true, 00:14:59.204 "reset": true, 00:14:59.204 "nvme_admin": false, 00:14:59.204 "nvme_io": false, 00:14:59.204 "nvme_io_md": false, 00:14:59.204 "write_zeroes": true, 00:14:59.204 "zcopy": true, 00:14:59.204 "get_zone_info": false, 00:14:59.204 "zone_management": false, 00:14:59.204 "zone_append": false, 00:14:59.204 "compare": false, 00:14:59.204 "compare_and_write": false, 00:14:59.204 "abort": true, 00:14:59.204 "seek_hole": false, 00:14:59.204 "seek_data": false, 00:14:59.204 "copy": true, 00:14:59.204 "nvme_iov_md": false 00:14:59.204 }, 00:14:59.204 "memory_domains": [ 00:14:59.204 { 00:14:59.204 "dma_device_id": "system", 00:14:59.204 "dma_device_type": 1 00:14:59.204 }, 00:14:59.204 { 00:14:59.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.204 "dma_device_type": 2 00:14:59.204 } 00:14:59.204 ], 00:14:59.204 "driver_specific": {} 00:14:59.204 }' 00:14:59.204 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.204 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.462 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.720 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.720 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:59.720 [2024-07-12 10:41:34.898673] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:59.720 [2024-07-12 10:41:34.898706] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:59.720 [2024-07-12 10:41:34.898759] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:59.720 [2024-07-12 10:41:34.898807] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:59.720 [2024-07-12 10:41:34.898818] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f72450 name Existed_Raid, state offline 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2048381 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2048381 ']' 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2048381 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2048381 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2048381' 00:14:59.977 killing process with pid 2048381 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2048381 00:14:59.977 [2024-07-12 10:41:34.969670] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:59.977 10:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2048381 00:14:59.977 [2024-07-12 10:41:34.995333] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:00.234 10:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:00.234 00:15:00.234 real 0m28.870s 00:15:00.234 user 0m53.048s 00:15:00.234 sys 0m5.142s 00:15:00.234 10:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:00.234 10:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.234 ************************************ 00:15:00.234 END TEST raid_state_function_test 00:15:00.234 ************************************ 00:15:00.234 10:41:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:00.234 10:41:35 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:00.234 10:41:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:00.234 10:41:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:00.234 10:41:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:00.234 ************************************ 00:15:00.234 START TEST raid_state_function_test_sb 00:15:00.234 ************************************ 00:15:00.234 10:41:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:00.234 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # 
local raid_level=concat 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2052845 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2052845' 00:15:00.235 Process raid pid: 2052845 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2052845 /var/tmp/spdk-raid.sock 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2052845 ']' 
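A minimal sketch of the setup this superblock variant is driving, assembled only from commands that appear in the trace itself (the /var/jenkins workspace path is the CI checkout; substitute your own SPDK tree):
  # start the bdev_svc test app on a private RPC socket with raid debug logging (-L bdev_raid)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
      -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  # once the socket is listening, create the concat raid; -s asks for an on-disk superblock
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  # with none of the base bdevs registered yet, the raid comes up in the "configuring" state,
  # which is exactly what the log reports a few lines below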
00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:00.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:00.235 10:41:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:00.235 [2024-07-12 10:41:35.352070] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:15:00.235 [2024-07-12 10:41:35.352131] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:00.492 [2024-07-12 10:41:35.475986] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:00.492 [2024-07-12 10:41:35.581878] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:00.492 [2024-07-12 10:41:35.646308] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.492 [2024-07-12 10:41:35.646342] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:01.426 [2024-07-12 10:41:36.513923] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:01.426 [2024-07-12 10:41:36.513964] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:01.426 [2024-07-12 10:41:36.513975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:01.426 [2024-07-12 10:41:36.513986] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:01.426 [2024-07-12 10:41:36.513995] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:01.426 [2024-07-12 10:41:36.514006] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.426 10:41:36 
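Every verify_raid_bdev_state call in this run reduces to the same RPC-plus-jq query; a sketch of that pattern against the socket above, with the RPC names and JSON fields exactly as they appear in the dumps:
  # fetch the raid bdev record and the counters the test compares
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")
      | {state, num_base_bdevs_discovered, num_base_bdevs_operational}'
  # register a missing base bdev (as the test does next) and re-query: the discovered count
  # rises, and state only flips from "configuring" to "online" once all three are configured
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_malloc_create 32 512 -b BaseBdev1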
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.426 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.683 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.683 "name": "Existed_Raid", 00:15:01.683 "uuid": "3e65c996-c9a8-4970-874d-6c0164fd7c41", 00:15:01.683 "strip_size_kb": 64, 00:15:01.683 "state": "configuring", 00:15:01.683 "raid_level": "concat", 00:15:01.683 "superblock": true, 00:15:01.683 "num_base_bdevs": 3, 00:15:01.683 "num_base_bdevs_discovered": 0, 00:15:01.683 "num_base_bdevs_operational": 3, 00:15:01.683 "base_bdevs_list": [ 00:15:01.683 { 00:15:01.683 "name": "BaseBdev1", 00:15:01.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.683 "is_configured": false, 00:15:01.683 "data_offset": 0, 00:15:01.683 "data_size": 0 00:15:01.683 }, 00:15:01.683 { 00:15:01.683 "name": "BaseBdev2", 00:15:01.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.683 "is_configured": false, 00:15:01.683 "data_offset": 0, 00:15:01.683 "data_size": 0 00:15:01.683 }, 00:15:01.683 { 00:15:01.683 "name": "BaseBdev3", 00:15:01.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.683 "is_configured": false, 00:15:01.683 "data_offset": 0, 00:15:01.683 "data_size": 0 00:15:01.683 } 00:15:01.683 ] 00:15:01.683 }' 00:15:01.683 10:41:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.683 10:41:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:02.249 10:41:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:02.509 [2024-07-12 10:41:37.596626] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:02.509 [2024-07-12 10:41:37.596655] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x222ca80 name Existed_Raid, state configuring 00:15:02.510 10:41:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:02.785 [2024-07-12 10:41:37.785161] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:02.785 [2024-07-12 10:41:37.785192] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:02.785 [2024-07-12 10:41:37.785206] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:02.785 [2024-07-12 10:41:37.785218] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:15:02.785 [2024-07-12 10:41:37.785226] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:02.786 [2024-07-12 10:41:37.785237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:02.786 10:41:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:03.057 [2024-07-12 10:41:38.039643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:03.057 BaseBdev1 00:15:03.057 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:03.057 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:03.057 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:03.057 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:03.057 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:03.057 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:03.057 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:03.331 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:03.589 [ 00:15:03.589 { 00:15:03.589 "name": "BaseBdev1", 00:15:03.589 "aliases": [ 00:15:03.589 "79831519-6e86-4e23-8a46-f02d2025da9c" 00:15:03.589 ], 00:15:03.589 "product_name": "Malloc disk", 00:15:03.589 "block_size": 512, 00:15:03.589 "num_blocks": 65536, 00:15:03.589 "uuid": "79831519-6e86-4e23-8a46-f02d2025da9c", 00:15:03.589 "assigned_rate_limits": { 00:15:03.589 "rw_ios_per_sec": 0, 00:15:03.589 "rw_mbytes_per_sec": 0, 00:15:03.589 "r_mbytes_per_sec": 0, 00:15:03.589 "w_mbytes_per_sec": 0 00:15:03.589 }, 00:15:03.589 "claimed": true, 00:15:03.589 "claim_type": "exclusive_write", 00:15:03.589 "zoned": false, 00:15:03.589 "supported_io_types": { 00:15:03.589 "read": true, 00:15:03.589 "write": true, 00:15:03.589 "unmap": true, 00:15:03.589 "flush": true, 00:15:03.589 "reset": true, 00:15:03.589 "nvme_admin": false, 00:15:03.589 "nvme_io": false, 00:15:03.589 "nvme_io_md": false, 00:15:03.589 "write_zeroes": true, 00:15:03.589 "zcopy": true, 00:15:03.589 "get_zone_info": false, 00:15:03.589 "zone_management": false, 00:15:03.589 "zone_append": false, 00:15:03.589 "compare": false, 00:15:03.589 "compare_and_write": false, 00:15:03.589 "abort": true, 00:15:03.589 "seek_hole": false, 00:15:03.589 "seek_data": false, 00:15:03.589 "copy": true, 00:15:03.589 "nvme_iov_md": false 00:15:03.589 }, 00:15:03.589 "memory_domains": [ 00:15:03.589 { 00:15:03.589 "dma_device_id": "system", 00:15:03.589 "dma_device_type": 1 00:15:03.589 }, 00:15:03.589 { 00:15:03.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.589 "dma_device_type": 2 00:15:03.589 } 00:15:03.589 ], 00:15:03.589 "driver_specific": {} 00:15:03.589 } 00:15:03.589 ] 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:03.589 
10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.589 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.848 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.848 "name": "Existed_Raid", 00:15:03.848 "uuid": "1ce9b72d-5521-4b6d-abcf-9ecc426a7579", 00:15:03.848 "strip_size_kb": 64, 00:15:03.848 "state": "configuring", 00:15:03.848 "raid_level": "concat", 00:15:03.848 "superblock": true, 00:15:03.848 "num_base_bdevs": 3, 00:15:03.848 "num_base_bdevs_discovered": 1, 00:15:03.848 "num_base_bdevs_operational": 3, 00:15:03.848 "base_bdevs_list": [ 00:15:03.848 { 00:15:03.848 "name": "BaseBdev1", 00:15:03.848 "uuid": "79831519-6e86-4e23-8a46-f02d2025da9c", 00:15:03.848 "is_configured": true, 00:15:03.848 "data_offset": 2048, 00:15:03.848 "data_size": 63488 00:15:03.848 }, 00:15:03.848 { 00:15:03.848 "name": "BaseBdev2", 00:15:03.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.848 "is_configured": false, 00:15:03.848 "data_offset": 0, 00:15:03.848 "data_size": 0 00:15:03.848 }, 00:15:03.848 { 00:15:03.848 "name": "BaseBdev3", 00:15:03.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.848 "is_configured": false, 00:15:03.848 "data_offset": 0, 00:15:03.848 "data_size": 0 00:15:03.848 } 00:15:03.848 ] 00:15:03.848 }' 00:15:03.848 10:41:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.848 10:41:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:04.415 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:04.415 [2024-07-12 10:41:39.587730] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:04.415 [2024-07-12 10:41:39.587768] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x222c310 name Existed_Raid, state configuring 00:15:04.415 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:04.673 [2024-07-12 10:41:39.816379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:04.673 [2024-07-12 10:41:39.817824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:04.674 [2024-07-12 10:41:39.817856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:04.674 [2024-07-12 10:41:39.817866] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:04.674 [2024-07-12 10:41:39.817877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.674 10:41:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.932 10:41:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.932 "name": "Existed_Raid", 00:15:04.932 "uuid": "8a878889-789b-4915-a411-21e514ef840b", 00:15:04.932 "strip_size_kb": 64, 00:15:04.932 "state": "configuring", 00:15:04.932 "raid_level": "concat", 00:15:04.932 "superblock": true, 00:15:04.932 "num_base_bdevs": 3, 00:15:04.932 "num_base_bdevs_discovered": 1, 00:15:04.932 "num_base_bdevs_operational": 3, 00:15:04.932 "base_bdevs_list": [ 00:15:04.932 { 00:15:04.932 "name": "BaseBdev1", 00:15:04.932 "uuid": "79831519-6e86-4e23-8a46-f02d2025da9c", 00:15:04.932 "is_configured": true, 00:15:04.932 "data_offset": 2048, 00:15:04.932 "data_size": 63488 00:15:04.932 }, 00:15:04.932 { 00:15:04.932 "name": "BaseBdev2", 00:15:04.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.932 "is_configured": false, 00:15:04.932 "data_offset": 0, 00:15:04.932 "data_size": 0 00:15:04.932 }, 00:15:04.932 { 
00:15:04.932 "name": "BaseBdev3", 00:15:04.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.932 "is_configured": false, 00:15:04.932 "data_offset": 0, 00:15:04.932 "data_size": 0 00:15:04.932 } 00:15:04.932 ] 00:15:04.932 }' 00:15:04.932 10:41:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.932 10:41:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:05.498 10:41:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:05.756 [2024-07-12 10:41:40.902579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:05.756 BaseBdev2 00:15:05.756 10:41:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:05.756 10:41:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:05.756 10:41:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:05.756 10:41:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:05.756 10:41:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:05.756 10:41:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:05.756 10:41:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:06.015 10:41:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:06.275 [ 00:15:06.275 { 00:15:06.275 "name": "BaseBdev2", 00:15:06.275 "aliases": [ 00:15:06.275 "ecf07137-5989-498b-aba1-f30ac227e3c6" 00:15:06.275 ], 00:15:06.275 "product_name": "Malloc disk", 00:15:06.275 "block_size": 512, 00:15:06.275 "num_blocks": 65536, 00:15:06.275 "uuid": "ecf07137-5989-498b-aba1-f30ac227e3c6", 00:15:06.275 "assigned_rate_limits": { 00:15:06.275 "rw_ios_per_sec": 0, 00:15:06.275 "rw_mbytes_per_sec": 0, 00:15:06.275 "r_mbytes_per_sec": 0, 00:15:06.275 "w_mbytes_per_sec": 0 00:15:06.275 }, 00:15:06.275 "claimed": true, 00:15:06.275 "claim_type": "exclusive_write", 00:15:06.275 "zoned": false, 00:15:06.275 "supported_io_types": { 00:15:06.275 "read": true, 00:15:06.275 "write": true, 00:15:06.275 "unmap": true, 00:15:06.275 "flush": true, 00:15:06.275 "reset": true, 00:15:06.275 "nvme_admin": false, 00:15:06.275 "nvme_io": false, 00:15:06.275 "nvme_io_md": false, 00:15:06.275 "write_zeroes": true, 00:15:06.275 "zcopy": true, 00:15:06.275 "get_zone_info": false, 00:15:06.275 "zone_management": false, 00:15:06.275 "zone_append": false, 00:15:06.275 "compare": false, 00:15:06.275 "compare_and_write": false, 00:15:06.275 "abort": true, 00:15:06.275 "seek_hole": false, 00:15:06.275 "seek_data": false, 00:15:06.275 "copy": true, 00:15:06.275 "nvme_iov_md": false 00:15:06.275 }, 00:15:06.275 "memory_domains": [ 00:15:06.275 { 00:15:06.275 "dma_device_id": "system", 00:15:06.275 "dma_device_type": 1 00:15:06.275 }, 00:15:06.275 { 00:15:06.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.275 "dma_device_type": 2 00:15:06.275 } 00:15:06.275 ], 00:15:06.275 
"driver_specific": {} 00:15:06.275 } 00:15:06.275 ] 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.275 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.534 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.534 "name": "Existed_Raid", 00:15:06.534 "uuid": "8a878889-789b-4915-a411-21e514ef840b", 00:15:06.534 "strip_size_kb": 64, 00:15:06.534 "state": "configuring", 00:15:06.534 "raid_level": "concat", 00:15:06.534 "superblock": true, 00:15:06.534 "num_base_bdevs": 3, 00:15:06.534 "num_base_bdevs_discovered": 2, 00:15:06.534 "num_base_bdevs_operational": 3, 00:15:06.534 "base_bdevs_list": [ 00:15:06.534 { 00:15:06.534 "name": "BaseBdev1", 00:15:06.534 "uuid": "79831519-6e86-4e23-8a46-f02d2025da9c", 00:15:06.534 "is_configured": true, 00:15:06.534 "data_offset": 2048, 00:15:06.534 "data_size": 63488 00:15:06.534 }, 00:15:06.534 { 00:15:06.534 "name": "BaseBdev2", 00:15:06.534 "uuid": "ecf07137-5989-498b-aba1-f30ac227e3c6", 00:15:06.534 "is_configured": true, 00:15:06.534 "data_offset": 2048, 00:15:06.534 "data_size": 63488 00:15:06.534 }, 00:15:06.534 { 00:15:06.534 "name": "BaseBdev3", 00:15:06.534 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.534 "is_configured": false, 00:15:06.534 "data_offset": 0, 00:15:06.534 "data_size": 0 00:15:06.534 } 00:15:06.534 ] 00:15:06.534 }' 00:15:06.534 10:41:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.534 10:41:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.102 10:41:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:15:07.374 [2024-07-12 10:41:42.498267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:07.374 [2024-07-12 10:41:42.498429] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x222d400 00:15:07.374 [2024-07-12 10:41:42.498443] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:07.374 [2024-07-12 10:41:42.498623] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x222cef0 00:15:07.374 [2024-07-12 10:41:42.498737] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x222d400 00:15:07.374 [2024-07-12 10:41:42.498747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x222d400 00:15:07.374 [2024-07-12 10:41:42.498835] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:07.374 BaseBdev3 00:15:07.374 10:41:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:07.374 10:41:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:07.374 10:41:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:07.374 10:41:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:07.374 10:41:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:07.374 10:41:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:07.374 10:41:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:07.633 10:41:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:07.896 [ 00:15:07.896 { 00:15:07.896 "name": "BaseBdev3", 00:15:07.896 "aliases": [ 00:15:07.896 "f3a2040d-140b-41f6-96ee-2d968910a4bd" 00:15:07.896 ], 00:15:07.896 "product_name": "Malloc disk", 00:15:07.896 "block_size": 512, 00:15:07.896 "num_blocks": 65536, 00:15:07.896 "uuid": "f3a2040d-140b-41f6-96ee-2d968910a4bd", 00:15:07.896 "assigned_rate_limits": { 00:15:07.896 "rw_ios_per_sec": 0, 00:15:07.896 "rw_mbytes_per_sec": 0, 00:15:07.896 "r_mbytes_per_sec": 0, 00:15:07.896 "w_mbytes_per_sec": 0 00:15:07.896 }, 00:15:07.896 "claimed": true, 00:15:07.896 "claim_type": "exclusive_write", 00:15:07.896 "zoned": false, 00:15:07.896 "supported_io_types": { 00:15:07.896 "read": true, 00:15:07.896 "write": true, 00:15:07.896 "unmap": true, 00:15:07.896 "flush": true, 00:15:07.896 "reset": true, 00:15:07.896 "nvme_admin": false, 00:15:07.896 "nvme_io": false, 00:15:07.896 "nvme_io_md": false, 00:15:07.896 "write_zeroes": true, 00:15:07.896 "zcopy": true, 00:15:07.896 "get_zone_info": false, 00:15:07.896 "zone_management": false, 00:15:07.896 "zone_append": false, 00:15:07.896 "compare": false, 00:15:07.896 "compare_and_write": false, 00:15:07.896 "abort": true, 00:15:07.896 "seek_hole": false, 00:15:07.897 "seek_data": false, 00:15:07.897 "copy": true, 00:15:07.897 "nvme_iov_md": false 00:15:07.897 }, 00:15:07.897 "memory_domains": [ 00:15:07.897 { 00:15:07.897 "dma_device_id": "system", 00:15:07.897 "dma_device_type": 1 00:15:07.897 }, 00:15:07.897 { 00:15:07.897 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:07.897 "dma_device_type": 2 00:15:07.897 } 00:15:07.897 ], 00:15:07.897 "driver_specific": {} 00:15:07.897 } 00:15:07.897 ] 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.897 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:08.161 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.161 "name": "Existed_Raid", 00:15:08.161 "uuid": "8a878889-789b-4915-a411-21e514ef840b", 00:15:08.161 "strip_size_kb": 64, 00:15:08.161 "state": "online", 00:15:08.161 "raid_level": "concat", 00:15:08.161 "superblock": true, 00:15:08.161 "num_base_bdevs": 3, 00:15:08.161 "num_base_bdevs_discovered": 3, 00:15:08.161 "num_base_bdevs_operational": 3, 00:15:08.161 "base_bdevs_list": [ 00:15:08.161 { 00:15:08.161 "name": "BaseBdev1", 00:15:08.161 "uuid": "79831519-6e86-4e23-8a46-f02d2025da9c", 00:15:08.161 "is_configured": true, 00:15:08.161 "data_offset": 2048, 00:15:08.161 "data_size": 63488 00:15:08.161 }, 00:15:08.161 { 00:15:08.161 "name": "BaseBdev2", 00:15:08.161 "uuid": "ecf07137-5989-498b-aba1-f30ac227e3c6", 00:15:08.161 "is_configured": true, 00:15:08.161 "data_offset": 2048, 00:15:08.161 "data_size": 63488 00:15:08.161 }, 00:15:08.161 { 00:15:08.161 "name": "BaseBdev3", 00:15:08.161 "uuid": "f3a2040d-140b-41f6-96ee-2d968910a4bd", 00:15:08.161 "is_configured": true, 00:15:08.161 "data_offset": 2048, 00:15:08.161 "data_size": 63488 00:15:08.161 } 00:15:08.161 ] 00:15:08.161 }' 00:15:08.161 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.161 10:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:08.729 10:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:08.988 [2024-07-12 10:41:44.050677] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:08.988 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:08.988 "name": "Existed_Raid", 00:15:08.988 "aliases": [ 00:15:08.988 "8a878889-789b-4915-a411-21e514ef840b" 00:15:08.988 ], 00:15:08.988 "product_name": "Raid Volume", 00:15:08.988 "block_size": 512, 00:15:08.988 "num_blocks": 190464, 00:15:08.988 "uuid": "8a878889-789b-4915-a411-21e514ef840b", 00:15:08.988 "assigned_rate_limits": { 00:15:08.988 "rw_ios_per_sec": 0, 00:15:08.988 "rw_mbytes_per_sec": 0, 00:15:08.988 "r_mbytes_per_sec": 0, 00:15:08.988 "w_mbytes_per_sec": 0 00:15:08.988 }, 00:15:08.988 "claimed": false, 00:15:08.988 "zoned": false, 00:15:08.988 "supported_io_types": { 00:15:08.988 "read": true, 00:15:08.988 "write": true, 00:15:08.988 "unmap": true, 00:15:08.988 "flush": true, 00:15:08.988 "reset": true, 00:15:08.988 "nvme_admin": false, 00:15:08.988 "nvme_io": false, 00:15:08.988 "nvme_io_md": false, 00:15:08.988 "write_zeroes": true, 00:15:08.988 "zcopy": false, 00:15:08.988 "get_zone_info": false, 00:15:08.988 "zone_management": false, 00:15:08.988 "zone_append": false, 00:15:08.988 "compare": false, 00:15:08.988 "compare_and_write": false, 00:15:08.988 "abort": false, 00:15:08.988 "seek_hole": false, 00:15:08.988 "seek_data": false, 00:15:08.988 "copy": false, 00:15:08.988 "nvme_iov_md": false 00:15:08.988 }, 00:15:08.988 "memory_domains": [ 00:15:08.988 { 00:15:08.988 "dma_device_id": "system", 00:15:08.988 "dma_device_type": 1 00:15:08.988 }, 00:15:08.988 { 00:15:08.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.988 "dma_device_type": 2 00:15:08.988 }, 00:15:08.988 { 00:15:08.988 "dma_device_id": "system", 00:15:08.989 "dma_device_type": 1 00:15:08.989 }, 00:15:08.989 { 00:15:08.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.989 "dma_device_type": 2 00:15:08.989 }, 00:15:08.989 { 00:15:08.989 "dma_device_id": "system", 00:15:08.989 "dma_device_type": 1 00:15:08.989 }, 00:15:08.989 { 00:15:08.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.989 "dma_device_type": 2 00:15:08.989 } 00:15:08.989 ], 00:15:08.989 "driver_specific": { 00:15:08.989 "raid": { 00:15:08.989 "uuid": "8a878889-789b-4915-a411-21e514ef840b", 00:15:08.989 "strip_size_kb": 64, 00:15:08.989 "state": "online", 00:15:08.989 "raid_level": "concat", 00:15:08.989 "superblock": true, 00:15:08.989 "num_base_bdevs": 3, 00:15:08.989 "num_base_bdevs_discovered": 3, 00:15:08.989 "num_base_bdevs_operational": 3, 00:15:08.989 "base_bdevs_list": [ 00:15:08.989 { 00:15:08.989 "name": "BaseBdev1", 00:15:08.989 
"uuid": "79831519-6e86-4e23-8a46-f02d2025da9c", 00:15:08.989 "is_configured": true, 00:15:08.989 "data_offset": 2048, 00:15:08.989 "data_size": 63488 00:15:08.989 }, 00:15:08.989 { 00:15:08.989 "name": "BaseBdev2", 00:15:08.989 "uuid": "ecf07137-5989-498b-aba1-f30ac227e3c6", 00:15:08.989 "is_configured": true, 00:15:08.989 "data_offset": 2048, 00:15:08.989 "data_size": 63488 00:15:08.989 }, 00:15:08.989 { 00:15:08.989 "name": "BaseBdev3", 00:15:08.989 "uuid": "f3a2040d-140b-41f6-96ee-2d968910a4bd", 00:15:08.989 "is_configured": true, 00:15:08.989 "data_offset": 2048, 00:15:08.989 "data_size": 63488 00:15:08.989 } 00:15:08.989 ] 00:15:08.989 } 00:15:08.989 } 00:15:08.989 }' 00:15:08.989 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:08.989 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:08.989 BaseBdev2 00:15:08.989 BaseBdev3' 00:15:08.989 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:08.989 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:08.989 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.248 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.248 "name": "BaseBdev1", 00:15:09.248 "aliases": [ 00:15:09.248 "79831519-6e86-4e23-8a46-f02d2025da9c" 00:15:09.248 ], 00:15:09.248 "product_name": "Malloc disk", 00:15:09.248 "block_size": 512, 00:15:09.248 "num_blocks": 65536, 00:15:09.248 "uuid": "79831519-6e86-4e23-8a46-f02d2025da9c", 00:15:09.248 "assigned_rate_limits": { 00:15:09.248 "rw_ios_per_sec": 0, 00:15:09.248 "rw_mbytes_per_sec": 0, 00:15:09.248 "r_mbytes_per_sec": 0, 00:15:09.248 "w_mbytes_per_sec": 0 00:15:09.248 }, 00:15:09.248 "claimed": true, 00:15:09.248 "claim_type": "exclusive_write", 00:15:09.248 "zoned": false, 00:15:09.248 "supported_io_types": { 00:15:09.248 "read": true, 00:15:09.248 "write": true, 00:15:09.248 "unmap": true, 00:15:09.248 "flush": true, 00:15:09.248 "reset": true, 00:15:09.248 "nvme_admin": false, 00:15:09.248 "nvme_io": false, 00:15:09.248 "nvme_io_md": false, 00:15:09.248 "write_zeroes": true, 00:15:09.248 "zcopy": true, 00:15:09.248 "get_zone_info": false, 00:15:09.248 "zone_management": false, 00:15:09.248 "zone_append": false, 00:15:09.248 "compare": false, 00:15:09.248 "compare_and_write": false, 00:15:09.248 "abort": true, 00:15:09.248 "seek_hole": false, 00:15:09.248 "seek_data": false, 00:15:09.248 "copy": true, 00:15:09.248 "nvme_iov_md": false 00:15:09.248 }, 00:15:09.248 "memory_domains": [ 00:15:09.248 { 00:15:09.248 "dma_device_id": "system", 00:15:09.248 "dma_device_type": 1 00:15:09.248 }, 00:15:09.248 { 00:15:09.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.248 "dma_device_type": 2 00:15:09.248 } 00:15:09.248 ], 00:15:09.248 "driver_specific": {} 00:15:09.248 }' 00:15:09.248 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.248 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.507 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.766 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:09.766 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:09.766 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:09.766 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.766 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.766 "name": "BaseBdev2", 00:15:09.766 "aliases": [ 00:15:09.766 "ecf07137-5989-498b-aba1-f30ac227e3c6" 00:15:09.766 ], 00:15:09.766 "product_name": "Malloc disk", 00:15:09.766 "block_size": 512, 00:15:09.766 "num_blocks": 65536, 00:15:09.766 "uuid": "ecf07137-5989-498b-aba1-f30ac227e3c6", 00:15:09.766 "assigned_rate_limits": { 00:15:09.766 "rw_ios_per_sec": 0, 00:15:09.766 "rw_mbytes_per_sec": 0, 00:15:09.766 "r_mbytes_per_sec": 0, 00:15:09.766 "w_mbytes_per_sec": 0 00:15:09.766 }, 00:15:09.766 "claimed": true, 00:15:09.766 "claim_type": "exclusive_write", 00:15:09.766 "zoned": false, 00:15:09.766 "supported_io_types": { 00:15:09.766 "read": true, 00:15:09.766 "write": true, 00:15:09.766 "unmap": true, 00:15:09.766 "flush": true, 00:15:09.766 "reset": true, 00:15:09.766 "nvme_admin": false, 00:15:09.766 "nvme_io": false, 00:15:09.766 "nvme_io_md": false, 00:15:09.766 "write_zeroes": true, 00:15:09.766 "zcopy": true, 00:15:09.766 "get_zone_info": false, 00:15:09.767 "zone_management": false, 00:15:09.767 "zone_append": false, 00:15:09.767 "compare": false, 00:15:09.767 "compare_and_write": false, 00:15:09.767 "abort": true, 00:15:09.767 "seek_hole": false, 00:15:09.767 "seek_data": false, 00:15:09.767 "copy": true, 00:15:09.767 "nvme_iov_md": false 00:15:09.767 }, 00:15:09.767 "memory_domains": [ 00:15:09.767 { 00:15:09.767 "dma_device_id": "system", 00:15:09.767 "dma_device_type": 1 00:15:09.767 }, 00:15:09.767 { 00:15:09.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.767 "dma_device_type": 2 00:15:09.767 } 00:15:09.767 ], 00:15:09.767 "driver_specific": {} 00:15:09.767 }' 00:15:09.767 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.026 10:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.026 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.284 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.284 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.284 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:10.284 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:10.284 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:10.544 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:10.544 "name": "BaseBdev3", 00:15:10.544 "aliases": [ 00:15:10.544 "f3a2040d-140b-41f6-96ee-2d968910a4bd" 00:15:10.544 ], 00:15:10.544 "product_name": "Malloc disk", 00:15:10.544 "block_size": 512, 00:15:10.544 "num_blocks": 65536, 00:15:10.544 "uuid": "f3a2040d-140b-41f6-96ee-2d968910a4bd", 00:15:10.544 "assigned_rate_limits": { 00:15:10.544 "rw_ios_per_sec": 0, 00:15:10.544 "rw_mbytes_per_sec": 0, 00:15:10.544 "r_mbytes_per_sec": 0, 00:15:10.544 "w_mbytes_per_sec": 0 00:15:10.544 }, 00:15:10.544 "claimed": true, 00:15:10.544 "claim_type": "exclusive_write", 00:15:10.544 "zoned": false, 00:15:10.544 "supported_io_types": { 00:15:10.544 "read": true, 00:15:10.544 "write": true, 00:15:10.544 "unmap": true, 00:15:10.544 "flush": true, 00:15:10.544 "reset": true, 00:15:10.544 "nvme_admin": false, 00:15:10.544 "nvme_io": false, 00:15:10.544 "nvme_io_md": false, 00:15:10.544 "write_zeroes": true, 00:15:10.544 "zcopy": true, 00:15:10.544 "get_zone_info": false, 00:15:10.544 "zone_management": false, 00:15:10.544 "zone_append": false, 00:15:10.544 "compare": false, 00:15:10.544 "compare_and_write": false, 00:15:10.544 "abort": true, 00:15:10.544 "seek_hole": false, 00:15:10.544 "seek_data": false, 00:15:10.544 "copy": true, 00:15:10.544 "nvme_iov_md": false 00:15:10.544 }, 00:15:10.544 "memory_domains": [ 00:15:10.544 { 00:15:10.544 "dma_device_id": "system", 00:15:10.544 "dma_device_type": 1 00:15:10.544 }, 00:15:10.544 { 00:15:10.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.544 "dma_device_type": 2 00:15:10.544 } 00:15:10.544 ], 00:15:10.544 "driver_specific": {} 00:15:10.544 }' 00:15:10.544 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.544 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.544 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.544 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.544 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.544 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.544 
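The jq .block_size / .md_size / .md_interleave / .dif_type pairs being traced here are the core of verify_raid_bdev_properties: every configured base bdev must report the same values as the Raid Volume itself, and a field that is absent on both sides compares as null against null, which is why the [[ null == null ]] tests pass. A condensed sketch of that loop, built from the same jq filters that appear in the trace (names are illustrative):

# Sketch of the property comparison: raid volume vs. each configured base bdev.
RPC_PY=./scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

raid_json=$($RPC_PY -s "$SOCK" bdev_get_bdevs -b Existed_Raid | jq '.[]')
names=$(echo "$raid_json" |
        jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

for name in $names; do
    base_json=$($RPC_PY -s "$SOCK" bdev_get_bdevs -b "$name" | jq '.[]')
    for field in .block_size .md_size .md_interleave .dif_type; do
        # Both sides print "null" when a field is missing, so absent matches absent.
        [ "$(echo "$raid_json" | jq "$field")" = "$(echo "$base_json" | jq "$field")" ] || exit 1
    done
done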
10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.803 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.803 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.803 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.803 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.803 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.803 10:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:11.063 [2024-07-12 10:41:46.099862] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:11.063 [2024-07-12 10:41:46.099887] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:11.063 [2024-07-12 10:41:46.099925] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.063 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.322 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.322 "name": "Existed_Raid", 00:15:11.322 "uuid": "8a878889-789b-4915-a411-21e514ef840b", 00:15:11.322 "strip_size_kb": 64, 00:15:11.322 "state": "offline", 00:15:11.322 "raid_level": 
"concat", 00:15:11.322 "superblock": true, 00:15:11.322 "num_base_bdevs": 3, 00:15:11.322 "num_base_bdevs_discovered": 2, 00:15:11.322 "num_base_bdevs_operational": 2, 00:15:11.322 "base_bdevs_list": [ 00:15:11.322 { 00:15:11.322 "name": null, 00:15:11.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.322 "is_configured": false, 00:15:11.322 "data_offset": 2048, 00:15:11.322 "data_size": 63488 00:15:11.322 }, 00:15:11.322 { 00:15:11.322 "name": "BaseBdev2", 00:15:11.322 "uuid": "ecf07137-5989-498b-aba1-f30ac227e3c6", 00:15:11.322 "is_configured": true, 00:15:11.322 "data_offset": 2048, 00:15:11.322 "data_size": 63488 00:15:11.322 }, 00:15:11.322 { 00:15:11.322 "name": "BaseBdev3", 00:15:11.322 "uuid": "f3a2040d-140b-41f6-96ee-2d968910a4bd", 00:15:11.322 "is_configured": true, 00:15:11.323 "data_offset": 2048, 00:15:11.323 "data_size": 63488 00:15:11.323 } 00:15:11.323 ] 00:15:11.323 }' 00:15:11.323 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.323 10:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.890 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:11.890 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:11.890 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.890 10:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:12.149 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:12.149 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:12.149 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:12.408 [2024-07-12 10:41:47.452447] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:12.408 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:12.408 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:12.408 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.408 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:12.667 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:12.667 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:12.667 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:12.926 [2024-07-12 10:41:47.953853] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:12.926 [2024-07-12 10:41:47.953893] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x222d400 name Existed_Raid, state offline 00:15:12.926 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:15:12.926 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:12.926 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.926 10:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:13.185 10:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:13.185 10:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:13.185 10:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:13.185 10:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:13.185 10:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:13.185 10:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:13.751 BaseBdev2 00:15:13.751 10:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:13.751 10:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:13.751 10:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:13.751 10:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:13.751 10:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:13.751 10:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:13.751 10:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:14.010 10:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:14.577 [ 00:15:14.577 { 00:15:14.577 "name": "BaseBdev2", 00:15:14.577 "aliases": [ 00:15:14.577 "a3468fc8-3402-4b4e-b7c6-00986967c2eb" 00:15:14.577 ], 00:15:14.577 "product_name": "Malloc disk", 00:15:14.577 "block_size": 512, 00:15:14.577 "num_blocks": 65536, 00:15:14.577 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:14.577 "assigned_rate_limits": { 00:15:14.577 "rw_ios_per_sec": 0, 00:15:14.577 "rw_mbytes_per_sec": 0, 00:15:14.577 "r_mbytes_per_sec": 0, 00:15:14.577 "w_mbytes_per_sec": 0 00:15:14.577 }, 00:15:14.577 "claimed": false, 00:15:14.577 "zoned": false, 00:15:14.577 "supported_io_types": { 00:15:14.577 "read": true, 00:15:14.577 "write": true, 00:15:14.577 "unmap": true, 00:15:14.577 "flush": true, 00:15:14.577 "reset": true, 00:15:14.577 "nvme_admin": false, 00:15:14.577 "nvme_io": false, 00:15:14.577 "nvme_io_md": false, 00:15:14.577 "write_zeroes": true, 00:15:14.577 "zcopy": true, 00:15:14.577 "get_zone_info": false, 00:15:14.577 "zone_management": false, 00:15:14.577 "zone_append": false, 00:15:14.577 "compare": false, 00:15:14.577 "compare_and_write": false, 00:15:14.577 "abort": true, 00:15:14.577 "seek_hole": false, 00:15:14.577 "seek_data": false, 00:15:14.577 "copy": 
true, 00:15:14.577 "nvme_iov_md": false 00:15:14.577 }, 00:15:14.577 "memory_domains": [ 00:15:14.577 { 00:15:14.577 "dma_device_id": "system", 00:15:14.577 "dma_device_type": 1 00:15:14.577 }, 00:15:14.577 { 00:15:14.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.577 "dma_device_type": 2 00:15:14.577 } 00:15:14.577 ], 00:15:14.577 "driver_specific": {} 00:15:14.577 } 00:15:14.577 ] 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:14.577 BaseBdev3 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:14.577 10:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.145 10:41:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:15.431 [ 00:15:15.431 { 00:15:15.431 "name": "BaseBdev3", 00:15:15.431 "aliases": [ 00:15:15.431 "8997ffca-f352-4691-b52c-b4de868b3fb7" 00:15:15.431 ], 00:15:15.431 "product_name": "Malloc disk", 00:15:15.431 "block_size": 512, 00:15:15.431 "num_blocks": 65536, 00:15:15.431 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:15.431 "assigned_rate_limits": { 00:15:15.431 "rw_ios_per_sec": 0, 00:15:15.431 "rw_mbytes_per_sec": 0, 00:15:15.431 "r_mbytes_per_sec": 0, 00:15:15.431 "w_mbytes_per_sec": 0 00:15:15.431 }, 00:15:15.431 "claimed": false, 00:15:15.431 "zoned": false, 00:15:15.431 "supported_io_types": { 00:15:15.431 "read": true, 00:15:15.431 "write": true, 00:15:15.431 "unmap": true, 00:15:15.431 "flush": true, 00:15:15.431 "reset": true, 00:15:15.431 "nvme_admin": false, 00:15:15.431 "nvme_io": false, 00:15:15.431 "nvme_io_md": false, 00:15:15.431 "write_zeroes": true, 00:15:15.431 "zcopy": true, 00:15:15.431 "get_zone_info": false, 00:15:15.431 "zone_management": false, 00:15:15.431 "zone_append": false, 00:15:15.431 "compare": false, 00:15:15.431 "compare_and_write": false, 00:15:15.431 "abort": true, 00:15:15.431 "seek_hole": false, 00:15:15.431 "seek_data": false, 00:15:15.431 "copy": true, 00:15:15.431 "nvme_iov_md": false 00:15:15.431 }, 00:15:15.431 "memory_domains": [ 00:15:15.431 { 00:15:15.431 "dma_device_id": "system", 00:15:15.431 "dma_device_type": 1 00:15:15.431 }, 00:15:15.431 { 00:15:15.431 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:15.432 "dma_device_type": 2 00:15:15.432 } 00:15:15.432 ], 00:15:15.432 "driver_specific": {} 00:15:15.432 } 00:15:15.432 ] 00:15:15.432 10:41:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:15.432 10:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:15.432 10:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:15.432 10:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:16.009 [2024-07-12 10:41:51.005237] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:16.009 [2024-07-12 10:41:51.005277] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:16.009 [2024-07-12 10:41:51.005297] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:16.009 [2024-07-12 10:41:51.006639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.009 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.269 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.269 "name": "Existed_Raid", 00:15:16.269 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:16.269 "strip_size_kb": 64, 00:15:16.269 "state": "configuring", 00:15:16.269 "raid_level": "concat", 00:15:16.269 "superblock": true, 00:15:16.269 "num_base_bdevs": 3, 00:15:16.269 "num_base_bdevs_discovered": 2, 00:15:16.269 "num_base_bdevs_operational": 3, 00:15:16.269 "base_bdevs_list": [ 00:15:16.269 { 00:15:16.269 "name": "BaseBdev1", 00:15:16.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.269 "is_configured": false, 00:15:16.269 "data_offset": 0, 00:15:16.269 "data_size": 0 00:15:16.269 }, 00:15:16.269 { 00:15:16.269 "name": 
"BaseBdev2", 00:15:16.269 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:16.269 "is_configured": true, 00:15:16.269 "data_offset": 2048, 00:15:16.269 "data_size": 63488 00:15:16.269 }, 00:15:16.269 { 00:15:16.269 "name": "BaseBdev3", 00:15:16.269 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:16.269 "is_configured": true, 00:15:16.269 "data_offset": 2048, 00:15:16.269 "data_size": 63488 00:15:16.269 } 00:15:16.269 ] 00:15:16.269 }' 00:15:16.269 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.269 10:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:16.836 10:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:17.095 [2024-07-12 10:41:52.068016] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.095 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.353 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.353 "name": "Existed_Raid", 00:15:17.353 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:17.353 "strip_size_kb": 64, 00:15:17.353 "state": "configuring", 00:15:17.353 "raid_level": "concat", 00:15:17.353 "superblock": true, 00:15:17.353 "num_base_bdevs": 3, 00:15:17.353 "num_base_bdevs_discovered": 1, 00:15:17.353 "num_base_bdevs_operational": 3, 00:15:17.353 "base_bdevs_list": [ 00:15:17.353 { 00:15:17.353 "name": "BaseBdev1", 00:15:17.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.353 "is_configured": false, 00:15:17.353 "data_offset": 0, 00:15:17.353 "data_size": 0 00:15:17.353 }, 00:15:17.353 { 00:15:17.353 "name": null, 00:15:17.354 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:17.354 "is_configured": false, 00:15:17.354 "data_offset": 2048, 00:15:17.354 "data_size": 63488 00:15:17.354 }, 00:15:17.354 { 00:15:17.354 "name": "BaseBdev3", 00:15:17.354 "uuid": 
"8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:17.354 "is_configured": true, 00:15:17.354 "data_offset": 2048, 00:15:17.354 "data_size": 63488 00:15:17.354 } 00:15:17.354 ] 00:15:17.354 }' 00:15:17.354 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.354 10:41:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:17.920 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:17.920 10:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.177 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:18.177 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:18.435 [2024-07-12 10:41:53.422991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:18.435 BaseBdev1 00:15:18.435 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:18.435 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:18.435 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:18.435 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:18.435 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:18.435 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:18.435 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.693 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:18.952 [ 00:15:18.952 { 00:15:18.952 "name": "BaseBdev1", 00:15:18.952 "aliases": [ 00:15:18.952 "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496" 00:15:18.952 ], 00:15:18.952 "product_name": "Malloc disk", 00:15:18.952 "block_size": 512, 00:15:18.952 "num_blocks": 65536, 00:15:18.952 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:18.952 "assigned_rate_limits": { 00:15:18.952 "rw_ios_per_sec": 0, 00:15:18.952 "rw_mbytes_per_sec": 0, 00:15:18.952 "r_mbytes_per_sec": 0, 00:15:18.952 "w_mbytes_per_sec": 0 00:15:18.952 }, 00:15:18.952 "claimed": true, 00:15:18.952 "claim_type": "exclusive_write", 00:15:18.952 "zoned": false, 00:15:18.952 "supported_io_types": { 00:15:18.952 "read": true, 00:15:18.952 "write": true, 00:15:18.952 "unmap": true, 00:15:18.952 "flush": true, 00:15:18.952 "reset": true, 00:15:18.952 "nvme_admin": false, 00:15:18.952 "nvme_io": false, 00:15:18.952 "nvme_io_md": false, 00:15:18.952 "write_zeroes": true, 00:15:18.952 "zcopy": true, 00:15:18.952 "get_zone_info": false, 00:15:18.952 "zone_management": false, 00:15:18.952 "zone_append": false, 00:15:18.952 "compare": false, 00:15:18.952 "compare_and_write": false, 00:15:18.952 "abort": true, 00:15:18.952 "seek_hole": false, 
00:15:18.952 "seek_data": false, 00:15:18.952 "copy": true, 00:15:18.952 "nvme_iov_md": false 00:15:18.952 }, 00:15:18.952 "memory_domains": [ 00:15:18.952 { 00:15:18.952 "dma_device_id": "system", 00:15:18.952 "dma_device_type": 1 00:15:18.952 }, 00:15:18.952 { 00:15:18.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.952 "dma_device_type": 2 00:15:18.952 } 00:15:18.952 ], 00:15:18.952 "driver_specific": {} 00:15:18.952 } 00:15:18.952 ] 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.952 10:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.210 10:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.210 "name": "Existed_Raid", 00:15:19.210 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:19.210 "strip_size_kb": 64, 00:15:19.210 "state": "configuring", 00:15:19.210 "raid_level": "concat", 00:15:19.210 "superblock": true, 00:15:19.210 "num_base_bdevs": 3, 00:15:19.210 "num_base_bdevs_discovered": 2, 00:15:19.210 "num_base_bdevs_operational": 3, 00:15:19.210 "base_bdevs_list": [ 00:15:19.210 { 00:15:19.210 "name": "BaseBdev1", 00:15:19.210 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:19.210 "is_configured": true, 00:15:19.210 "data_offset": 2048, 00:15:19.210 "data_size": 63488 00:15:19.210 }, 00:15:19.210 { 00:15:19.210 "name": null, 00:15:19.210 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:19.210 "is_configured": false, 00:15:19.210 "data_offset": 2048, 00:15:19.210 "data_size": 63488 00:15:19.210 }, 00:15:19.210 { 00:15:19.210 "name": "BaseBdev3", 00:15:19.210 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:19.210 "is_configured": true, 00:15:19.210 "data_offset": 2048, 00:15:19.210 "data_size": 63488 00:15:19.210 } 00:15:19.210 ] 00:15:19.210 }' 00:15:19.210 10:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.210 10:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.776 10:41:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.776 10:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:20.035 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:20.035 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:20.294 [2024-07-12 10:41:55.255889] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.294 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.552 10:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.552 "name": "Existed_Raid", 00:15:20.552 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:20.552 "strip_size_kb": 64, 00:15:20.552 "state": "configuring", 00:15:20.552 "raid_level": "concat", 00:15:20.552 "superblock": true, 00:15:20.552 "num_base_bdevs": 3, 00:15:20.552 "num_base_bdevs_discovered": 1, 00:15:20.552 "num_base_bdevs_operational": 3, 00:15:20.552 "base_bdevs_list": [ 00:15:20.552 { 00:15:20.552 "name": "BaseBdev1", 00:15:20.552 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:20.552 "is_configured": true, 00:15:20.552 "data_offset": 2048, 00:15:20.552 "data_size": 63488 00:15:20.552 }, 00:15:20.552 { 00:15:20.552 "name": null, 00:15:20.552 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:20.552 "is_configured": false, 00:15:20.552 "data_offset": 2048, 00:15:20.552 "data_size": 63488 00:15:20.552 }, 00:15:20.552 { 00:15:20.552 "name": null, 00:15:20.552 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:20.552 "is_configured": false, 00:15:20.552 "data_offset": 2048, 00:15:20.552 "data_size": 63488 00:15:20.552 } 00:15:20.552 ] 00:15:20.552 }' 00:15:20.552 10:41:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.552 10:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:21.117 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.117 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:21.376 [2024-07-12 10:41:56.515262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.376 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.635 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.635 "name": "Existed_Raid", 00:15:21.635 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:21.635 "strip_size_kb": 64, 00:15:21.635 "state": "configuring", 00:15:21.635 "raid_level": "concat", 00:15:21.635 "superblock": true, 00:15:21.635 "num_base_bdevs": 3, 00:15:21.635 "num_base_bdevs_discovered": 2, 00:15:21.635 "num_base_bdevs_operational": 3, 00:15:21.635 "base_bdevs_list": [ 00:15:21.635 { 00:15:21.635 "name": "BaseBdev1", 00:15:21.635 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:21.635 "is_configured": true, 00:15:21.635 "data_offset": 2048, 00:15:21.635 "data_size": 63488 00:15:21.635 }, 00:15:21.635 { 00:15:21.635 "name": null, 00:15:21.635 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:21.635 "is_configured": false, 00:15:21.635 "data_offset": 2048, 00:15:21.635 "data_size": 63488 00:15:21.635 }, 00:15:21.635 { 00:15:21.635 "name": "BaseBdev3", 00:15:21.635 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 
00:15:21.635 "is_configured": true, 00:15:21.635 "data_offset": 2048, 00:15:21.635 "data_size": 63488 00:15:21.635 } 00:15:21.635 ] 00:15:21.635 }' 00:15:21.635 10:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.635 10:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.201 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.201 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:22.460 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:22.460 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:22.718 [2024-07-12 10:41:57.850816] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.718 10:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.976 10:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.976 "name": "Existed_Raid", 00:15:22.976 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:22.976 "strip_size_kb": 64, 00:15:22.976 "state": "configuring", 00:15:22.976 "raid_level": "concat", 00:15:22.976 "superblock": true, 00:15:22.976 "num_base_bdevs": 3, 00:15:22.976 "num_base_bdevs_discovered": 1, 00:15:22.976 "num_base_bdevs_operational": 3, 00:15:22.976 "base_bdevs_list": [ 00:15:22.976 { 00:15:22.976 "name": null, 00:15:22.976 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:22.976 "is_configured": false, 00:15:22.976 "data_offset": 2048, 00:15:22.976 "data_size": 63488 00:15:22.976 }, 00:15:22.976 { 00:15:22.976 "name": null, 00:15:22.976 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:22.976 "is_configured": false, 00:15:22.976 "data_offset": 2048, 
00:15:22.976 "data_size": 63488 00:15:22.976 }, 00:15:22.976 { 00:15:22.976 "name": "BaseBdev3", 00:15:22.976 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:22.976 "is_configured": true, 00:15:22.976 "data_offset": 2048, 00:15:22.976 "data_size": 63488 00:15:22.976 } 00:15:22.976 ] 00:15:22.976 }' 00:15:22.976 10:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.976 10:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.542 10:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:23.542 10:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.800 10:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:23.800 10:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:24.057 [2024-07-12 10:41:59.188687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:24.057 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:24.057 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.058 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.316 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.316 "name": "Existed_Raid", 00:15:24.316 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:24.316 "strip_size_kb": 64, 00:15:24.316 "state": "configuring", 00:15:24.316 "raid_level": "concat", 00:15:24.316 "superblock": true, 00:15:24.316 "num_base_bdevs": 3, 00:15:24.316 "num_base_bdevs_discovered": 2, 00:15:24.316 "num_base_bdevs_operational": 3, 00:15:24.316 "base_bdevs_list": [ 00:15:24.316 { 00:15:24.316 "name": null, 00:15:24.316 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:24.316 "is_configured": false, 00:15:24.316 "data_offset": 2048, 00:15:24.316 
"data_size": 63488 00:15:24.316 }, 00:15:24.316 { 00:15:24.316 "name": "BaseBdev2", 00:15:24.316 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:24.316 "is_configured": true, 00:15:24.316 "data_offset": 2048, 00:15:24.316 "data_size": 63488 00:15:24.316 }, 00:15:24.316 { 00:15:24.316 "name": "BaseBdev3", 00:15:24.316 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:24.316 "is_configured": true, 00:15:24.316 "data_offset": 2048, 00:15:24.316 "data_size": 63488 00:15:24.316 } 00:15:24.316 ] 00:15:24.316 }' 00:15:24.316 10:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.316 10:41:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:24.881 10:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.881 10:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:25.138 10:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:25.138 10:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.138 10:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:25.396 10:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496 00:15:25.654 [2024-07-12 10:42:00.624353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:25.654 [2024-07-12 10:42:00.624517] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x222bf50 00:15:25.654 [2024-07-12 10:42:00.624531] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:25.654 [2024-07-12 10:42:00.624708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f32940 00:15:25.654 [2024-07-12 10:42:00.624821] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x222bf50 00:15:25.654 [2024-07-12 10:42:00.624831] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x222bf50 00:15:25.654 [2024-07-12 10:42:00.624923] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:25.654 NewBaseBdev 00:15:25.654 10:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:25.654 10:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:25.654 10:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:25.654 10:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:25.654 10:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:25.654 10:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:25.654 10:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:25.912 10:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:25.912 [ 00:15:25.912 { 00:15:25.912 "name": "NewBaseBdev", 00:15:25.912 "aliases": [ 00:15:25.912 "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496" 00:15:25.912 ], 00:15:25.912 "product_name": "Malloc disk", 00:15:25.912 "block_size": 512, 00:15:25.912 "num_blocks": 65536, 00:15:25.912 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:25.912 "assigned_rate_limits": { 00:15:25.912 "rw_ios_per_sec": 0, 00:15:25.912 "rw_mbytes_per_sec": 0, 00:15:25.912 "r_mbytes_per_sec": 0, 00:15:25.912 "w_mbytes_per_sec": 0 00:15:25.912 }, 00:15:25.912 "claimed": true, 00:15:25.912 "claim_type": "exclusive_write", 00:15:25.912 "zoned": false, 00:15:25.912 "supported_io_types": { 00:15:25.912 "read": true, 00:15:25.912 "write": true, 00:15:25.912 "unmap": true, 00:15:25.912 "flush": true, 00:15:25.912 "reset": true, 00:15:25.912 "nvme_admin": false, 00:15:25.912 "nvme_io": false, 00:15:25.912 "nvme_io_md": false, 00:15:25.912 "write_zeroes": true, 00:15:25.912 "zcopy": true, 00:15:25.912 "get_zone_info": false, 00:15:25.912 "zone_management": false, 00:15:25.912 "zone_append": false, 00:15:25.912 "compare": false, 00:15:25.912 "compare_and_write": false, 00:15:25.912 "abort": true, 00:15:25.912 "seek_hole": false, 00:15:25.912 "seek_data": false, 00:15:25.912 "copy": true, 00:15:25.912 "nvme_iov_md": false 00:15:25.912 }, 00:15:25.912 "memory_domains": [ 00:15:25.912 { 00:15:25.912 "dma_device_id": "system", 00:15:25.912 "dma_device_type": 1 00:15:25.912 }, 00:15:25.912 { 00:15:25.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.912 "dma_device_type": 2 00:15:25.912 } 00:15:25.912 ], 00:15:25.912 "driver_specific": {} 00:15:25.912 } 00:15:25.912 ] 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:26.169 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.170 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.170 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.170 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.170 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.170 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:26.426 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.427 "name": "Existed_Raid", 00:15:26.427 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:26.427 "strip_size_kb": 64, 00:15:26.427 "state": "online", 00:15:26.427 "raid_level": "concat", 00:15:26.427 "superblock": true, 00:15:26.427 "num_base_bdevs": 3, 00:15:26.427 "num_base_bdevs_discovered": 3, 00:15:26.427 "num_base_bdevs_operational": 3, 00:15:26.427 "base_bdevs_list": [ 00:15:26.427 { 00:15:26.427 "name": "NewBaseBdev", 00:15:26.427 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:26.427 "is_configured": true, 00:15:26.427 "data_offset": 2048, 00:15:26.427 "data_size": 63488 00:15:26.427 }, 00:15:26.427 { 00:15:26.427 "name": "BaseBdev2", 00:15:26.427 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:26.427 "is_configured": true, 00:15:26.427 "data_offset": 2048, 00:15:26.427 "data_size": 63488 00:15:26.427 }, 00:15:26.427 { 00:15:26.427 "name": "BaseBdev3", 00:15:26.427 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:26.427 "is_configured": true, 00:15:26.427 "data_offset": 2048, 00:15:26.427 "data_size": 63488 00:15:26.427 } 00:15:26.427 ] 00:15:26.427 }' 00:15:26.427 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.427 10:42:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:26.992 10:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:27.250 [2024-07-12 10:42:02.224905] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:27.250 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:27.251 "name": "Existed_Raid", 00:15:27.251 "aliases": [ 00:15:27.251 "af0fa177-335d-4f80-9362-d5ea53a37740" 00:15:27.251 ], 00:15:27.251 "product_name": "Raid Volume", 00:15:27.251 "block_size": 512, 00:15:27.251 "num_blocks": 190464, 00:15:27.251 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:27.251 "assigned_rate_limits": { 00:15:27.251 "rw_ios_per_sec": 0, 00:15:27.251 "rw_mbytes_per_sec": 0, 00:15:27.251 "r_mbytes_per_sec": 0, 00:15:27.251 "w_mbytes_per_sec": 0 00:15:27.251 }, 00:15:27.251 "claimed": false, 00:15:27.251 "zoned": false, 00:15:27.251 "supported_io_types": { 00:15:27.251 "read": true, 00:15:27.251 "write": true, 00:15:27.251 "unmap": true, 00:15:27.251 "flush": true, 00:15:27.251 "reset": true, 00:15:27.251 "nvme_admin": false, 00:15:27.251 "nvme_io": false, 00:15:27.251 "nvme_io_md": false, 00:15:27.251 "write_zeroes": true, 00:15:27.251 
"zcopy": false, 00:15:27.251 "get_zone_info": false, 00:15:27.251 "zone_management": false, 00:15:27.251 "zone_append": false, 00:15:27.251 "compare": false, 00:15:27.251 "compare_and_write": false, 00:15:27.251 "abort": false, 00:15:27.251 "seek_hole": false, 00:15:27.251 "seek_data": false, 00:15:27.251 "copy": false, 00:15:27.251 "nvme_iov_md": false 00:15:27.251 }, 00:15:27.251 "memory_domains": [ 00:15:27.251 { 00:15:27.251 "dma_device_id": "system", 00:15:27.251 "dma_device_type": 1 00:15:27.251 }, 00:15:27.251 { 00:15:27.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.251 "dma_device_type": 2 00:15:27.251 }, 00:15:27.251 { 00:15:27.251 "dma_device_id": "system", 00:15:27.251 "dma_device_type": 1 00:15:27.251 }, 00:15:27.251 { 00:15:27.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.251 "dma_device_type": 2 00:15:27.251 }, 00:15:27.251 { 00:15:27.251 "dma_device_id": "system", 00:15:27.251 "dma_device_type": 1 00:15:27.251 }, 00:15:27.251 { 00:15:27.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.251 "dma_device_type": 2 00:15:27.251 } 00:15:27.251 ], 00:15:27.251 "driver_specific": { 00:15:27.251 "raid": { 00:15:27.251 "uuid": "af0fa177-335d-4f80-9362-d5ea53a37740", 00:15:27.251 "strip_size_kb": 64, 00:15:27.251 "state": "online", 00:15:27.251 "raid_level": "concat", 00:15:27.251 "superblock": true, 00:15:27.251 "num_base_bdevs": 3, 00:15:27.251 "num_base_bdevs_discovered": 3, 00:15:27.251 "num_base_bdevs_operational": 3, 00:15:27.251 "base_bdevs_list": [ 00:15:27.251 { 00:15:27.251 "name": "NewBaseBdev", 00:15:27.251 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:27.251 "is_configured": true, 00:15:27.251 "data_offset": 2048, 00:15:27.251 "data_size": 63488 00:15:27.251 }, 00:15:27.251 { 00:15:27.251 "name": "BaseBdev2", 00:15:27.251 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:27.251 "is_configured": true, 00:15:27.251 "data_offset": 2048, 00:15:27.251 "data_size": 63488 00:15:27.251 }, 00:15:27.251 { 00:15:27.251 "name": "BaseBdev3", 00:15:27.251 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:27.251 "is_configured": true, 00:15:27.251 "data_offset": 2048, 00:15:27.251 "data_size": 63488 00:15:27.251 } 00:15:27.251 ] 00:15:27.251 } 00:15:27.251 } 00:15:27.251 }' 00:15:27.251 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:27.251 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:27.251 BaseBdev2 00:15:27.251 BaseBdev3' 00:15:27.251 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.251 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:27.251 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:27.510 "name": "NewBaseBdev", 00:15:27.510 "aliases": [ 00:15:27.510 "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496" 00:15:27.510 ], 00:15:27.510 "product_name": "Malloc disk", 00:15:27.510 "block_size": 512, 00:15:27.510 "num_blocks": 65536, 00:15:27.510 "uuid": "44dc1eff-36d5-48aa-b0a0-b1fb8dc6a496", 00:15:27.510 "assigned_rate_limits": { 00:15:27.510 "rw_ios_per_sec": 0, 00:15:27.510 "rw_mbytes_per_sec": 0, 
00:15:27.510 "r_mbytes_per_sec": 0, 00:15:27.510 "w_mbytes_per_sec": 0 00:15:27.510 }, 00:15:27.510 "claimed": true, 00:15:27.510 "claim_type": "exclusive_write", 00:15:27.510 "zoned": false, 00:15:27.510 "supported_io_types": { 00:15:27.510 "read": true, 00:15:27.510 "write": true, 00:15:27.510 "unmap": true, 00:15:27.510 "flush": true, 00:15:27.510 "reset": true, 00:15:27.510 "nvme_admin": false, 00:15:27.510 "nvme_io": false, 00:15:27.510 "nvme_io_md": false, 00:15:27.510 "write_zeroes": true, 00:15:27.510 "zcopy": true, 00:15:27.510 "get_zone_info": false, 00:15:27.510 "zone_management": false, 00:15:27.510 "zone_append": false, 00:15:27.510 "compare": false, 00:15:27.510 "compare_and_write": false, 00:15:27.510 "abort": true, 00:15:27.510 "seek_hole": false, 00:15:27.510 "seek_data": false, 00:15:27.510 "copy": true, 00:15:27.510 "nvme_iov_md": false 00:15:27.510 }, 00:15:27.510 "memory_domains": [ 00:15:27.510 { 00:15:27.510 "dma_device_id": "system", 00:15:27.510 "dma_device_type": 1 00:15:27.510 }, 00:15:27.510 { 00:15:27.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.510 "dma_device_type": 2 00:15:27.510 } 00:15:27.510 ], 00:15:27.510 "driver_specific": {} 00:15:27.510 }' 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.510 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:27.768 10:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:28.025 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:28.025 "name": "BaseBdev2", 00:15:28.025 "aliases": [ 00:15:28.025 "a3468fc8-3402-4b4e-b7c6-00986967c2eb" 00:15:28.025 ], 00:15:28.025 "product_name": "Malloc disk", 00:15:28.025 "block_size": 512, 00:15:28.025 "num_blocks": 65536, 00:15:28.025 "uuid": "a3468fc8-3402-4b4e-b7c6-00986967c2eb", 00:15:28.025 "assigned_rate_limits": { 00:15:28.025 "rw_ios_per_sec": 0, 00:15:28.025 "rw_mbytes_per_sec": 0, 00:15:28.025 "r_mbytes_per_sec": 0, 00:15:28.025 "w_mbytes_per_sec": 0 00:15:28.025 }, 00:15:28.025 "claimed": true, 00:15:28.025 
"claim_type": "exclusive_write", 00:15:28.025 "zoned": false, 00:15:28.025 "supported_io_types": { 00:15:28.025 "read": true, 00:15:28.025 "write": true, 00:15:28.025 "unmap": true, 00:15:28.025 "flush": true, 00:15:28.025 "reset": true, 00:15:28.025 "nvme_admin": false, 00:15:28.025 "nvme_io": false, 00:15:28.025 "nvme_io_md": false, 00:15:28.025 "write_zeroes": true, 00:15:28.025 "zcopy": true, 00:15:28.025 "get_zone_info": false, 00:15:28.025 "zone_management": false, 00:15:28.025 "zone_append": false, 00:15:28.025 "compare": false, 00:15:28.025 "compare_and_write": false, 00:15:28.025 "abort": true, 00:15:28.025 "seek_hole": false, 00:15:28.025 "seek_data": false, 00:15:28.025 "copy": true, 00:15:28.025 "nvme_iov_md": false 00:15:28.025 }, 00:15:28.025 "memory_domains": [ 00:15:28.025 { 00:15:28.025 "dma_device_id": "system", 00:15:28.025 "dma_device_type": 1 00:15:28.025 }, 00:15:28.025 { 00:15:28.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.025 "dma_device_type": 2 00:15:28.025 } 00:15:28.025 ], 00:15:28.025 "driver_specific": {} 00:15:28.025 }' 00:15:28.025 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.025 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.025 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:28.025 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.025 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:28.281 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:28.538 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:28.538 "name": "BaseBdev3", 00:15:28.538 "aliases": [ 00:15:28.538 "8997ffca-f352-4691-b52c-b4de868b3fb7" 00:15:28.538 ], 00:15:28.538 "product_name": "Malloc disk", 00:15:28.538 "block_size": 512, 00:15:28.538 "num_blocks": 65536, 00:15:28.538 "uuid": "8997ffca-f352-4691-b52c-b4de868b3fb7", 00:15:28.539 "assigned_rate_limits": { 00:15:28.539 "rw_ios_per_sec": 0, 00:15:28.539 "rw_mbytes_per_sec": 0, 00:15:28.539 "r_mbytes_per_sec": 0, 00:15:28.539 "w_mbytes_per_sec": 0 00:15:28.539 }, 00:15:28.539 "claimed": true, 00:15:28.539 "claim_type": "exclusive_write", 00:15:28.539 "zoned": false, 00:15:28.539 "supported_io_types": { 00:15:28.539 "read": true, 
00:15:28.539 "write": true, 00:15:28.539 "unmap": true, 00:15:28.539 "flush": true, 00:15:28.539 "reset": true, 00:15:28.539 "nvme_admin": false, 00:15:28.539 "nvme_io": false, 00:15:28.539 "nvme_io_md": false, 00:15:28.539 "write_zeroes": true, 00:15:28.539 "zcopy": true, 00:15:28.539 "get_zone_info": false, 00:15:28.539 "zone_management": false, 00:15:28.539 "zone_append": false, 00:15:28.539 "compare": false, 00:15:28.539 "compare_and_write": false, 00:15:28.539 "abort": true, 00:15:28.539 "seek_hole": false, 00:15:28.539 "seek_data": false, 00:15:28.539 "copy": true, 00:15:28.539 "nvme_iov_md": false 00:15:28.539 }, 00:15:28.539 "memory_domains": [ 00:15:28.539 { 00:15:28.539 "dma_device_id": "system", 00:15:28.539 "dma_device_type": 1 00:15:28.539 }, 00:15:28.539 { 00:15:28.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.539 "dma_device_type": 2 00:15:28.539 } 00:15:28.539 ], 00:15:28.539 "driver_specific": {} 00:15:28.539 }' 00:15:28.539 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.539 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:28.795 10:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:29.053 [2024-07-12 10:42:04.133704] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:29.053 [2024-07-12 10:42:04.133729] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:29.053 [2024-07-12 10:42:04.133780] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:29.053 [2024-07-12 10:42:04.133828] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:29.053 [2024-07-12 10:42:04.133840] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x222bf50 name Existed_Raid, state offline 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2052845 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2052845 ']' 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2052845 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 
00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2052845 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2052845' 00:15:29.053 killing process with pid 2052845 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2052845 00:15:29.053 [2024-07-12 10:42:04.203586] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:29.053 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2052845 00:15:29.053 [2024-07-12 10:42:04.231513] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:29.311 10:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:29.311 00:15:29.311 real 0m29.163s 00:15:29.311 user 0m53.487s 00:15:29.311 sys 0m5.247s 00:15:29.311 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:29.311 10:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:29.311 ************************************ 00:15:29.311 END TEST raid_state_function_test_sb 00:15:29.311 ************************************ 00:15:29.311 10:42:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:29.311 10:42:04 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:15:29.311 10:42:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:29.311 10:42:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:29.311 10:42:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:29.569 ************************************ 00:15:29.569 START TEST raid_superblock_test 00:15:29.569 ************************************ 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local 
strip_size_create_arg 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2057267 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2057267 /var/tmp/spdk-raid.sock 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2057267 ']' 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:29.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:29.569 10:42:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.569 [2024-07-12 10:42:04.592427] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:15:29.569 [2024-07-12 10:42:04.592505] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2057267 ] 00:15:29.569 [2024-07-12 10:42:04.722537] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:29.826 [2024-07-12 10:42:04.833875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:29.826 [2024-07-12 10:42:04.898538] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:29.826 [2024-07-12 10:42:04.898562] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:30.392 10:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:30.956 malloc1 00:15:30.956 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:31.214 [2024-07-12 10:42:06.241340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:31.214 [2024-07-12 10:42:06.241390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:31.214 [2024-07-12 10:42:06.241414] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b0570 00:15:31.214 [2024-07-12 10:42:06.241427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:31.214 [2024-07-12 10:42:06.243183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:31.214 [2024-07-12 10:42:06.243212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:31.214 pt1 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:31.214 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:31.472 malloc2 00:15:31.472 10:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:32.041 [2024-07-12 10:42:06.992128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:32.041 [2024-07-12 10:42:06.992173] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:32.041 [2024-07-12 10:42:06.992191] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b1970 00:15:32.041 [2024-07-12 10:42:06.992209] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:32.041 [2024-07-12 10:42:06.993825] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:32.041 [2024-07-12 10:42:06.993853] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:32.041 pt2 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:32.041 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:32.298 malloc3 00:15:32.298 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:32.298 [2024-07-12 10:42:07.490805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:32.298 [2024-07-12 10:42:07.490852] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:32.298 [2024-07-12 10:42:07.490868] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1348340 00:15:32.298 [2024-07-12 10:42:07.490881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:32.298 [2024-07-12 10:42:07.492447] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:32.298 [2024-07-12 10:42:07.492475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:32.556 pt3 00:15:32.556 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:32.556 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:32.556 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:32.556 [2024-07-12 10:42:07.735478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:32.556 [2024-07-12 10:42:07.736851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:32.556 [2024-07-12 10:42:07.736907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:32.556 [2024-07-12 10:42:07.737055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11a8ea0 00:15:32.556 [2024-07-12 10:42:07.737067] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:32.556 [2024-07-12 10:42:07.737269] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11b0240 00:15:32.556 [2024-07-12 10:42:07.737413] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11a8ea0 00:15:32.556 [2024-07-12 10:42:07.737423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11a8ea0 00:15:32.556 [2024-07-12 10:42:07.737531] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:32.813 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:32.813 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:32.813 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:32.813 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.813 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.814 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.814 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.814 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.814 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.814 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.814 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.814 10:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:33.377 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.377 "name": "raid_bdev1", 00:15:33.377 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:33.377 "strip_size_kb": 64, 00:15:33.377 "state": "online", 00:15:33.377 "raid_level": "concat", 00:15:33.377 "superblock": true, 00:15:33.377 "num_base_bdevs": 3, 
00:15:33.377 "num_base_bdevs_discovered": 3, 00:15:33.377 "num_base_bdevs_operational": 3, 00:15:33.377 "base_bdevs_list": [ 00:15:33.377 { 00:15:33.377 "name": "pt1", 00:15:33.377 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:33.377 "is_configured": true, 00:15:33.377 "data_offset": 2048, 00:15:33.377 "data_size": 63488 00:15:33.377 }, 00:15:33.377 { 00:15:33.377 "name": "pt2", 00:15:33.377 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:33.377 "is_configured": true, 00:15:33.377 "data_offset": 2048, 00:15:33.377 "data_size": 63488 00:15:33.377 }, 00:15:33.377 { 00:15:33.377 "name": "pt3", 00:15:33.377 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:33.377 "is_configured": true, 00:15:33.377 "data_offset": 2048, 00:15:33.377 "data_size": 63488 00:15:33.377 } 00:15:33.377 ] 00:15:33.377 }' 00:15:33.377 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.377 10:42:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:33.978 10:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:33.978 [2024-07-12 10:42:09.079292] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.978 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:33.978 "name": "raid_bdev1", 00:15:33.978 "aliases": [ 00:15:33.978 "7088c199-d114-45a6-ad94-fb34dcedcf28" 00:15:33.978 ], 00:15:33.978 "product_name": "Raid Volume", 00:15:33.978 "block_size": 512, 00:15:33.978 "num_blocks": 190464, 00:15:33.978 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:33.978 "assigned_rate_limits": { 00:15:33.978 "rw_ios_per_sec": 0, 00:15:33.978 "rw_mbytes_per_sec": 0, 00:15:33.978 "r_mbytes_per_sec": 0, 00:15:33.978 "w_mbytes_per_sec": 0 00:15:33.978 }, 00:15:33.978 "claimed": false, 00:15:33.978 "zoned": false, 00:15:33.978 "supported_io_types": { 00:15:33.978 "read": true, 00:15:33.978 "write": true, 00:15:33.978 "unmap": true, 00:15:33.978 "flush": true, 00:15:33.978 "reset": true, 00:15:33.978 "nvme_admin": false, 00:15:33.978 "nvme_io": false, 00:15:33.978 "nvme_io_md": false, 00:15:33.978 "write_zeroes": true, 00:15:33.978 "zcopy": false, 00:15:33.978 "get_zone_info": false, 00:15:33.978 "zone_management": false, 00:15:33.978 "zone_append": false, 00:15:33.978 "compare": false, 00:15:33.978 "compare_and_write": false, 00:15:33.978 "abort": false, 00:15:33.978 "seek_hole": false, 00:15:33.978 "seek_data": false, 00:15:33.978 "copy": false, 00:15:33.978 "nvme_iov_md": false 00:15:33.978 }, 00:15:33.978 "memory_domains": [ 00:15:33.978 { 00:15:33.979 "dma_device_id": "system", 00:15:33.979 "dma_device_type": 1 
00:15:33.979 }, 00:15:33.979 { 00:15:33.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.979 "dma_device_type": 2 00:15:33.979 }, 00:15:33.979 { 00:15:33.979 "dma_device_id": "system", 00:15:33.979 "dma_device_type": 1 00:15:33.979 }, 00:15:33.979 { 00:15:33.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.979 "dma_device_type": 2 00:15:33.979 }, 00:15:33.979 { 00:15:33.979 "dma_device_id": "system", 00:15:33.979 "dma_device_type": 1 00:15:33.979 }, 00:15:33.979 { 00:15:33.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.979 "dma_device_type": 2 00:15:33.979 } 00:15:33.979 ], 00:15:33.979 "driver_specific": { 00:15:33.979 "raid": { 00:15:33.979 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:33.979 "strip_size_kb": 64, 00:15:33.979 "state": "online", 00:15:33.979 "raid_level": "concat", 00:15:33.979 "superblock": true, 00:15:33.979 "num_base_bdevs": 3, 00:15:33.979 "num_base_bdevs_discovered": 3, 00:15:33.979 "num_base_bdevs_operational": 3, 00:15:33.979 "base_bdevs_list": [ 00:15:33.979 { 00:15:33.979 "name": "pt1", 00:15:33.979 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:33.979 "is_configured": true, 00:15:33.979 "data_offset": 2048, 00:15:33.979 "data_size": 63488 00:15:33.979 }, 00:15:33.979 { 00:15:33.979 "name": "pt2", 00:15:33.979 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:33.979 "is_configured": true, 00:15:33.979 "data_offset": 2048, 00:15:33.979 "data_size": 63488 00:15:33.979 }, 00:15:33.979 { 00:15:33.979 "name": "pt3", 00:15:33.979 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:33.979 "is_configured": true, 00:15:33.979 "data_offset": 2048, 00:15:33.979 "data_size": 63488 00:15:33.979 } 00:15:33.979 ] 00:15:33.979 } 00:15:33.979 } 00:15:33.979 }' 00:15:33.979 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:33.979 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:33.979 pt2 00:15:33.979 pt3' 00:15:33.979 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.979 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:33.979 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.237 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.237 "name": "pt1", 00:15:34.237 "aliases": [ 00:15:34.237 "00000000-0000-0000-0000-000000000001" 00:15:34.237 ], 00:15:34.237 "product_name": "passthru", 00:15:34.237 "block_size": 512, 00:15:34.237 "num_blocks": 65536, 00:15:34.237 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:34.237 "assigned_rate_limits": { 00:15:34.237 "rw_ios_per_sec": 0, 00:15:34.237 "rw_mbytes_per_sec": 0, 00:15:34.237 "r_mbytes_per_sec": 0, 00:15:34.237 "w_mbytes_per_sec": 0 00:15:34.237 }, 00:15:34.237 "claimed": true, 00:15:34.237 "claim_type": "exclusive_write", 00:15:34.237 "zoned": false, 00:15:34.237 "supported_io_types": { 00:15:34.237 "read": true, 00:15:34.237 "write": true, 00:15:34.237 "unmap": true, 00:15:34.237 "flush": true, 00:15:34.237 "reset": true, 00:15:34.237 "nvme_admin": false, 00:15:34.237 "nvme_io": false, 00:15:34.237 "nvme_io_md": false, 00:15:34.237 "write_zeroes": true, 00:15:34.237 "zcopy": true, 00:15:34.237 "get_zone_info": false, 00:15:34.237 "zone_management": false, 
00:15:34.237 "zone_append": false, 00:15:34.237 "compare": false, 00:15:34.237 "compare_and_write": false, 00:15:34.237 "abort": true, 00:15:34.237 "seek_hole": false, 00:15:34.237 "seek_data": false, 00:15:34.237 "copy": true, 00:15:34.237 "nvme_iov_md": false 00:15:34.237 }, 00:15:34.237 "memory_domains": [ 00:15:34.237 { 00:15:34.237 "dma_device_id": "system", 00:15:34.237 "dma_device_type": 1 00:15:34.237 }, 00:15:34.237 { 00:15:34.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.237 "dma_device_type": 2 00:15:34.237 } 00:15:34.237 ], 00:15:34.237 "driver_specific": { 00:15:34.237 "passthru": { 00:15:34.237 "name": "pt1", 00:15:34.237 "base_bdev_name": "malloc1" 00:15:34.237 } 00:15:34.237 } 00:15:34.237 }' 00:15:34.237 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.495 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.752 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.752 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.752 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:34.752 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:34.752 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:35.009 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:35.009 "name": "pt2", 00:15:35.009 "aliases": [ 00:15:35.009 "00000000-0000-0000-0000-000000000002" 00:15:35.009 ], 00:15:35.009 "product_name": "passthru", 00:15:35.009 "block_size": 512, 00:15:35.009 "num_blocks": 65536, 00:15:35.009 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:35.009 "assigned_rate_limits": { 00:15:35.009 "rw_ios_per_sec": 0, 00:15:35.009 "rw_mbytes_per_sec": 0, 00:15:35.009 "r_mbytes_per_sec": 0, 00:15:35.009 "w_mbytes_per_sec": 0 00:15:35.009 }, 00:15:35.009 "claimed": true, 00:15:35.009 "claim_type": "exclusive_write", 00:15:35.009 "zoned": false, 00:15:35.009 "supported_io_types": { 00:15:35.009 "read": true, 00:15:35.009 "write": true, 00:15:35.009 "unmap": true, 00:15:35.009 "flush": true, 00:15:35.009 "reset": true, 00:15:35.009 "nvme_admin": false, 00:15:35.009 "nvme_io": false, 00:15:35.009 "nvme_io_md": false, 00:15:35.009 "write_zeroes": true, 00:15:35.009 "zcopy": true, 00:15:35.009 "get_zone_info": false, 00:15:35.009 "zone_management": false, 00:15:35.009 "zone_append": false, 00:15:35.009 "compare": false, 00:15:35.009 "compare_and_write": false, 00:15:35.009 "abort": true, 
00:15:35.009 "seek_hole": false, 00:15:35.009 "seek_data": false, 00:15:35.009 "copy": true, 00:15:35.009 "nvme_iov_md": false 00:15:35.009 }, 00:15:35.009 "memory_domains": [ 00:15:35.009 { 00:15:35.009 "dma_device_id": "system", 00:15:35.009 "dma_device_type": 1 00:15:35.009 }, 00:15:35.009 { 00:15:35.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.009 "dma_device_type": 2 00:15:35.009 } 00:15:35.009 ], 00:15:35.009 "driver_specific": { 00:15:35.009 "passthru": { 00:15:35.009 "name": "pt2", 00:15:35.009 "base_bdev_name": "malloc2" 00:15:35.009 } 00:15:35.009 } 00:15:35.009 }' 00:15:35.009 10:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.009 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.009 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:35.009 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.009 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.009 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:35.009 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:35.267 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:35.526 "name": "pt3", 00:15:35.526 "aliases": [ 00:15:35.526 "00000000-0000-0000-0000-000000000003" 00:15:35.526 ], 00:15:35.526 "product_name": "passthru", 00:15:35.526 "block_size": 512, 00:15:35.526 "num_blocks": 65536, 00:15:35.526 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:35.526 "assigned_rate_limits": { 00:15:35.526 "rw_ios_per_sec": 0, 00:15:35.526 "rw_mbytes_per_sec": 0, 00:15:35.526 "r_mbytes_per_sec": 0, 00:15:35.526 "w_mbytes_per_sec": 0 00:15:35.526 }, 00:15:35.526 "claimed": true, 00:15:35.526 "claim_type": "exclusive_write", 00:15:35.526 "zoned": false, 00:15:35.526 "supported_io_types": { 00:15:35.526 "read": true, 00:15:35.526 "write": true, 00:15:35.526 "unmap": true, 00:15:35.526 "flush": true, 00:15:35.526 "reset": true, 00:15:35.526 "nvme_admin": false, 00:15:35.526 "nvme_io": false, 00:15:35.526 "nvme_io_md": false, 00:15:35.526 "write_zeroes": true, 00:15:35.526 "zcopy": true, 00:15:35.526 "get_zone_info": false, 00:15:35.526 "zone_management": false, 00:15:35.526 "zone_append": false, 00:15:35.526 "compare": false, 00:15:35.526 "compare_and_write": false, 00:15:35.526 "abort": true, 00:15:35.526 "seek_hole": false, 00:15:35.526 "seek_data": false, 00:15:35.526 "copy": true, 00:15:35.526 "nvme_iov_md": false 
00:15:35.526 }, 00:15:35.526 "memory_domains": [ 00:15:35.526 { 00:15:35.526 "dma_device_id": "system", 00:15:35.526 "dma_device_type": 1 00:15:35.526 }, 00:15:35.526 { 00:15:35.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.526 "dma_device_type": 2 00:15:35.526 } 00:15:35.526 ], 00:15:35.526 "driver_specific": { 00:15:35.526 "passthru": { 00:15:35.526 "name": "pt3", 00:15:35.526 "base_bdev_name": "malloc3" 00:15:35.526 } 00:15:35.526 } 00:15:35.526 }' 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:35.526 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.784 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.784 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:35.784 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.784 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.784 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:35.784 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:35.784 10:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:36.042 [2024-07-12 10:42:11.108706] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:36.042 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7088c199-d114-45a6-ad94-fb34dcedcf28 00:15:36.042 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7088c199-d114-45a6-ad94-fb34dcedcf28 ']' 00:15:36.042 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:36.300 [2024-07-12 10:42:11.353053] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:36.300 [2024-07-12 10:42:11.353073] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:36.300 [2024-07-12 10:42:11.353121] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:36.300 [2024-07-12 10:42:11.353174] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:36.300 [2024-07-12 10:42:11.353186] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11a8ea0 name raid_bdev1, state offline 00:15:36.300 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.300 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:36.558 10:42:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:36.558 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:36.558 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:36.558 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:36.816 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:36.816 10:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:37.074 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:37.074 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:37.332 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:37.332 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:37.590 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:37.591 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:37.849 [2024-07-12 10:42:12.796890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:37.849 [2024-07-12 10:42:12.798274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:37.849 [2024-07-12 10:42:12.798318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:37.849 [2024-07-12 10:42:12.798362] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:37.849 [2024-07-12 10:42:12.798402] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:37.849 [2024-07-12 10:42:12.798424] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:37.849 [2024-07-12 10:42:12.798443] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:37.849 [2024-07-12 10:42:12.798452] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1353ff0 name raid_bdev1, state configuring 00:15:37.849 request: 00:15:37.849 { 00:15:37.849 "name": "raid_bdev1", 00:15:37.849 "raid_level": "concat", 00:15:37.849 "base_bdevs": [ 00:15:37.849 "malloc1", 00:15:37.849 "malloc2", 00:15:37.849 "malloc3" 00:15:37.849 ], 00:15:37.849 "strip_size_kb": 64, 00:15:37.849 "superblock": false, 00:15:37.849 "method": "bdev_raid_create", 00:15:37.849 "req_id": 1 00:15:37.849 } 00:15:37.849 Got JSON-RPC error response 00:15:37.849 response: 00:15:37.849 { 00:15:37.849 "code": -17, 00:15:37.849 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:37.849 } 00:15:37.849 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:37.849 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:37.849 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:37.849 10:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:37.849 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.849 10:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:38.107 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:38.107 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:38.107 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:38.107 [2024-07-12 10:42:13.286124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:38.107 [2024-07-12 10:42:13.286166] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:38.107 [2024-07-12 10:42:13.286187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b07a0 00:15:38.107 [2024-07-12 10:42:13.286199] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:38.107 [2024-07-12 10:42:13.287780] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:38.107 [2024-07-12 10:42:13.287807] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:38.107 [2024-07-12 10:42:13.287872] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:38.107 [2024-07-12 10:42:13.287897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:38.107 pt1 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.365 "name": "raid_bdev1", 00:15:38.365 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:38.365 "strip_size_kb": 64, 00:15:38.365 "state": "configuring", 00:15:38.365 "raid_level": "concat", 00:15:38.365 "superblock": true, 00:15:38.365 "num_base_bdevs": 3, 00:15:38.365 "num_base_bdevs_discovered": 1, 00:15:38.365 "num_base_bdevs_operational": 3, 00:15:38.365 "base_bdevs_list": [ 00:15:38.365 { 00:15:38.365 "name": "pt1", 00:15:38.365 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.365 "is_configured": true, 00:15:38.365 "data_offset": 2048, 00:15:38.365 "data_size": 63488 00:15:38.365 }, 00:15:38.365 { 00:15:38.365 "name": null, 00:15:38.365 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.365 "is_configured": false, 00:15:38.365 "data_offset": 2048, 00:15:38.365 "data_size": 63488 00:15:38.365 }, 00:15:38.365 { 00:15:38.365 "name": null, 00:15:38.365 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.365 "is_configured": false, 00:15:38.365 "data_offset": 2048, 00:15:38.365 "data_size": 63488 00:15:38.365 } 00:15:38.365 ] 00:15:38.365 }' 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.365 10:42:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.297 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:39.297 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:39.297 [2024-07-12 
10:42:14.365025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:39.297 [2024-07-12 10:42:14.365079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:39.297 [2024-07-12 10:42:14.365097] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a7c70 00:15:39.297 [2024-07-12 10:42:14.365110] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:39.297 [2024-07-12 10:42:14.365448] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:39.297 [2024-07-12 10:42:14.365465] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:39.297 [2024-07-12 10:42:14.365541] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:39.297 [2024-07-12 10:42:14.365560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:39.297 pt2 00:15:39.297 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:39.555 [2024-07-12 10:42:14.613752] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.555 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:39.813 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.813 "name": "raid_bdev1", 00:15:39.813 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:39.813 "strip_size_kb": 64, 00:15:39.813 "state": "configuring", 00:15:39.813 "raid_level": "concat", 00:15:39.813 "superblock": true, 00:15:39.813 "num_base_bdevs": 3, 00:15:39.813 "num_base_bdevs_discovered": 1, 00:15:39.813 "num_base_bdevs_operational": 3, 00:15:39.813 "base_bdevs_list": [ 00:15:39.813 { 00:15:39.813 "name": "pt1", 00:15:39.813 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:39.813 "is_configured": true, 00:15:39.813 "data_offset": 2048, 00:15:39.813 "data_size": 63488 00:15:39.813 }, 00:15:39.813 { 00:15:39.813 "name": null, 00:15:39.813 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.813 "is_configured": false, 
00:15:39.813 "data_offset": 2048, 00:15:39.813 "data_size": 63488 00:15:39.813 }, 00:15:39.813 { 00:15:39.813 "name": null, 00:15:39.813 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:39.813 "is_configured": false, 00:15:39.813 "data_offset": 2048, 00:15:39.813 "data_size": 63488 00:15:39.813 } 00:15:39.813 ] 00:15:39.813 }' 00:15:39.813 10:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.813 10:42:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.378 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:40.378 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:40.378 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:40.636 [2024-07-12 10:42:15.648507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:40.636 [2024-07-12 10:42:15.648555] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.636 [2024-07-12 10:42:15.648581] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b0a10 00:15:40.636 [2024-07-12 10:42:15.648593] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.636 [2024-07-12 10:42:15.648914] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.636 [2024-07-12 10:42:15.648930] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:40.636 [2024-07-12 10:42:15.648988] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:40.636 [2024-07-12 10:42:15.649006] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:40.636 pt2 00:15:40.636 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:40.636 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:40.636 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:40.895 [2024-07-12 10:42:15.897160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:40.895 [2024-07-12 10:42:15.897193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.895 [2024-07-12 10:42:15.897208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x134a740 00:15:40.895 [2024-07-12 10:42:15.897219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.895 [2024-07-12 10:42:15.897502] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.895 [2024-07-12 10:42:15.897519] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:40.895 [2024-07-12 10:42:15.897568] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:40.895 [2024-07-12 10:42:15.897584] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:40.895 [2024-07-12 10:42:15.897686] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x134ac00 00:15:40.895 [2024-07-12 10:42:15.897696] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:40.895 [2024-07-12 10:42:15.897859] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11afa40 00:15:40.895 [2024-07-12 10:42:15.897979] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x134ac00 00:15:40.895 [2024-07-12 10:42:15.897989] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x134ac00 00:15:40.895 [2024-07-12 10:42:15.898078] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:40.895 pt3 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.895 10:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:41.153 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.153 "name": "raid_bdev1", 00:15:41.153 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:41.153 "strip_size_kb": 64, 00:15:41.153 "state": "online", 00:15:41.153 "raid_level": "concat", 00:15:41.153 "superblock": true, 00:15:41.153 "num_base_bdevs": 3, 00:15:41.153 "num_base_bdevs_discovered": 3, 00:15:41.153 "num_base_bdevs_operational": 3, 00:15:41.153 "base_bdevs_list": [ 00:15:41.153 { 00:15:41.153 "name": "pt1", 00:15:41.153 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:41.153 "is_configured": true, 00:15:41.153 "data_offset": 2048, 00:15:41.153 "data_size": 63488 00:15:41.153 }, 00:15:41.153 { 00:15:41.153 "name": "pt2", 00:15:41.153 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:41.153 "is_configured": true, 00:15:41.153 "data_offset": 2048, 00:15:41.153 "data_size": 63488 00:15:41.153 }, 00:15:41.153 { 00:15:41.153 "name": "pt3", 00:15:41.153 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:41.153 "is_configured": true, 00:15:41.153 "data_offset": 2048, 00:15:41.153 "data_size": 63488 00:15:41.153 } 00:15:41.153 ] 00:15:41.153 }' 00:15:41.153 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.153 10:42:16 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:41.719 10:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:41.977 [2024-07-12 10:42:16.996360] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:41.977 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:41.977 "name": "raid_bdev1", 00:15:41.977 "aliases": [ 00:15:41.977 "7088c199-d114-45a6-ad94-fb34dcedcf28" 00:15:41.977 ], 00:15:41.977 "product_name": "Raid Volume", 00:15:41.977 "block_size": 512, 00:15:41.977 "num_blocks": 190464, 00:15:41.977 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:41.977 "assigned_rate_limits": { 00:15:41.977 "rw_ios_per_sec": 0, 00:15:41.977 "rw_mbytes_per_sec": 0, 00:15:41.977 "r_mbytes_per_sec": 0, 00:15:41.977 "w_mbytes_per_sec": 0 00:15:41.977 }, 00:15:41.977 "claimed": false, 00:15:41.977 "zoned": false, 00:15:41.977 "supported_io_types": { 00:15:41.977 "read": true, 00:15:41.977 "write": true, 00:15:41.977 "unmap": true, 00:15:41.977 "flush": true, 00:15:41.977 "reset": true, 00:15:41.977 "nvme_admin": false, 00:15:41.977 "nvme_io": false, 00:15:41.977 "nvme_io_md": false, 00:15:41.977 "write_zeroes": true, 00:15:41.977 "zcopy": false, 00:15:41.977 "get_zone_info": false, 00:15:41.977 "zone_management": false, 00:15:41.977 "zone_append": false, 00:15:41.977 "compare": false, 00:15:41.977 "compare_and_write": false, 00:15:41.977 "abort": false, 00:15:41.977 "seek_hole": false, 00:15:41.977 "seek_data": false, 00:15:41.977 "copy": false, 00:15:41.977 "nvme_iov_md": false 00:15:41.977 }, 00:15:41.977 "memory_domains": [ 00:15:41.977 { 00:15:41.977 "dma_device_id": "system", 00:15:41.977 "dma_device_type": 1 00:15:41.978 }, 00:15:41.978 { 00:15:41.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.978 "dma_device_type": 2 00:15:41.978 }, 00:15:41.978 { 00:15:41.978 "dma_device_id": "system", 00:15:41.978 "dma_device_type": 1 00:15:41.978 }, 00:15:41.978 { 00:15:41.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.978 "dma_device_type": 2 00:15:41.978 }, 00:15:41.978 { 00:15:41.978 "dma_device_id": "system", 00:15:41.978 "dma_device_type": 1 00:15:41.978 }, 00:15:41.978 { 00:15:41.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.978 "dma_device_type": 2 00:15:41.978 } 00:15:41.978 ], 00:15:41.978 "driver_specific": { 00:15:41.978 "raid": { 00:15:41.978 "uuid": "7088c199-d114-45a6-ad94-fb34dcedcf28", 00:15:41.978 "strip_size_kb": 64, 00:15:41.978 "state": "online", 00:15:41.978 "raid_level": "concat", 00:15:41.978 "superblock": true, 00:15:41.978 "num_base_bdevs": 3, 00:15:41.978 "num_base_bdevs_discovered": 3, 
00:15:41.978 "num_base_bdevs_operational": 3, 00:15:41.978 "base_bdevs_list": [ 00:15:41.978 { 00:15:41.978 "name": "pt1", 00:15:41.978 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:41.978 "is_configured": true, 00:15:41.978 "data_offset": 2048, 00:15:41.978 "data_size": 63488 00:15:41.978 }, 00:15:41.978 { 00:15:41.978 "name": "pt2", 00:15:41.978 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:41.978 "is_configured": true, 00:15:41.978 "data_offset": 2048, 00:15:41.978 "data_size": 63488 00:15:41.978 }, 00:15:41.978 { 00:15:41.978 "name": "pt3", 00:15:41.978 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:41.978 "is_configured": true, 00:15:41.978 "data_offset": 2048, 00:15:41.978 "data_size": 63488 00:15:41.978 } 00:15:41.978 ] 00:15:41.978 } 00:15:41.978 } 00:15:41.978 }' 00:15:41.978 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:41.978 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:41.978 pt2 00:15:41.978 pt3' 00:15:41.978 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.978 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:41.978 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.236 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.236 "name": "pt1", 00:15:42.236 "aliases": [ 00:15:42.236 "00000000-0000-0000-0000-000000000001" 00:15:42.236 ], 00:15:42.236 "product_name": "passthru", 00:15:42.236 "block_size": 512, 00:15:42.236 "num_blocks": 65536, 00:15:42.236 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:42.236 "assigned_rate_limits": { 00:15:42.236 "rw_ios_per_sec": 0, 00:15:42.236 "rw_mbytes_per_sec": 0, 00:15:42.236 "r_mbytes_per_sec": 0, 00:15:42.236 "w_mbytes_per_sec": 0 00:15:42.236 }, 00:15:42.236 "claimed": true, 00:15:42.236 "claim_type": "exclusive_write", 00:15:42.236 "zoned": false, 00:15:42.236 "supported_io_types": { 00:15:42.236 "read": true, 00:15:42.236 "write": true, 00:15:42.236 "unmap": true, 00:15:42.236 "flush": true, 00:15:42.236 "reset": true, 00:15:42.236 "nvme_admin": false, 00:15:42.236 "nvme_io": false, 00:15:42.236 "nvme_io_md": false, 00:15:42.236 "write_zeroes": true, 00:15:42.236 "zcopy": true, 00:15:42.236 "get_zone_info": false, 00:15:42.236 "zone_management": false, 00:15:42.236 "zone_append": false, 00:15:42.236 "compare": false, 00:15:42.236 "compare_and_write": false, 00:15:42.236 "abort": true, 00:15:42.236 "seek_hole": false, 00:15:42.236 "seek_data": false, 00:15:42.236 "copy": true, 00:15:42.236 "nvme_iov_md": false 00:15:42.236 }, 00:15:42.236 "memory_domains": [ 00:15:42.236 { 00:15:42.236 "dma_device_id": "system", 00:15:42.236 "dma_device_type": 1 00:15:42.236 }, 00:15:42.236 { 00:15:42.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.236 "dma_device_type": 2 00:15:42.236 } 00:15:42.236 ], 00:15:42.236 "driver_specific": { 00:15:42.236 "passthru": { 00:15:42.236 "name": "pt1", 00:15:42.236 "base_bdev_name": "malloc1" 00:15:42.236 } 00:15:42.236 } 00:15:42.236 }' 00:15:42.236 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.236 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:15:42.236 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.236 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.236 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:42.494 10:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.060 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.060 "name": "pt2", 00:15:43.060 "aliases": [ 00:15:43.060 "00000000-0000-0000-0000-000000000002" 00:15:43.060 ], 00:15:43.060 "product_name": "passthru", 00:15:43.060 "block_size": 512, 00:15:43.060 "num_blocks": 65536, 00:15:43.060 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.060 "assigned_rate_limits": { 00:15:43.060 "rw_ios_per_sec": 0, 00:15:43.060 "rw_mbytes_per_sec": 0, 00:15:43.060 "r_mbytes_per_sec": 0, 00:15:43.060 "w_mbytes_per_sec": 0 00:15:43.060 }, 00:15:43.060 "claimed": true, 00:15:43.060 "claim_type": "exclusive_write", 00:15:43.060 "zoned": false, 00:15:43.060 "supported_io_types": { 00:15:43.060 "read": true, 00:15:43.060 "write": true, 00:15:43.060 "unmap": true, 00:15:43.060 "flush": true, 00:15:43.060 "reset": true, 00:15:43.060 "nvme_admin": false, 00:15:43.060 "nvme_io": false, 00:15:43.060 "nvme_io_md": false, 00:15:43.060 "write_zeroes": true, 00:15:43.060 "zcopy": true, 00:15:43.060 "get_zone_info": false, 00:15:43.060 "zone_management": false, 00:15:43.060 "zone_append": false, 00:15:43.060 "compare": false, 00:15:43.060 "compare_and_write": false, 00:15:43.060 "abort": true, 00:15:43.060 "seek_hole": false, 00:15:43.060 "seek_data": false, 00:15:43.060 "copy": true, 00:15:43.060 "nvme_iov_md": false 00:15:43.060 }, 00:15:43.060 "memory_domains": [ 00:15:43.060 { 00:15:43.060 "dma_device_id": "system", 00:15:43.060 "dma_device_type": 1 00:15:43.060 }, 00:15:43.060 { 00:15:43.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.060 "dma_device_type": 2 00:15:43.060 } 00:15:43.060 ], 00:15:43.060 "driver_specific": { 00:15:43.060 "passthru": { 00:15:43.060 "name": "pt2", 00:15:43.060 "base_bdev_name": "malloc2" 00:15:43.060 } 00:15:43.060 } 00:15:43.060 }' 00:15:43.060 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.060 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.060 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.060 10:42:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.060 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:43.317 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.574 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.574 "name": "pt3", 00:15:43.574 "aliases": [ 00:15:43.574 "00000000-0000-0000-0000-000000000003" 00:15:43.574 ], 00:15:43.574 "product_name": "passthru", 00:15:43.574 "block_size": 512, 00:15:43.574 "num_blocks": 65536, 00:15:43.574 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:43.574 "assigned_rate_limits": { 00:15:43.574 "rw_ios_per_sec": 0, 00:15:43.574 "rw_mbytes_per_sec": 0, 00:15:43.574 "r_mbytes_per_sec": 0, 00:15:43.574 "w_mbytes_per_sec": 0 00:15:43.574 }, 00:15:43.574 "claimed": true, 00:15:43.574 "claim_type": "exclusive_write", 00:15:43.574 "zoned": false, 00:15:43.574 "supported_io_types": { 00:15:43.574 "read": true, 00:15:43.574 "write": true, 00:15:43.574 "unmap": true, 00:15:43.574 "flush": true, 00:15:43.574 "reset": true, 00:15:43.574 "nvme_admin": false, 00:15:43.574 "nvme_io": false, 00:15:43.574 "nvme_io_md": false, 00:15:43.574 "write_zeroes": true, 00:15:43.574 "zcopy": true, 00:15:43.574 "get_zone_info": false, 00:15:43.574 "zone_management": false, 00:15:43.574 "zone_append": false, 00:15:43.574 "compare": false, 00:15:43.574 "compare_and_write": false, 00:15:43.574 "abort": true, 00:15:43.574 "seek_hole": false, 00:15:43.574 "seek_data": false, 00:15:43.574 "copy": true, 00:15:43.574 "nvme_iov_md": false 00:15:43.574 }, 00:15:43.574 "memory_domains": [ 00:15:43.574 { 00:15:43.574 "dma_device_id": "system", 00:15:43.574 "dma_device_type": 1 00:15:43.574 }, 00:15:43.574 { 00:15:43.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.574 "dma_device_type": 2 00:15:43.574 } 00:15:43.574 ], 00:15:43.574 "driver_specific": { 00:15:43.574 "passthru": { 00:15:43.574 "name": "pt3", 00:15:43.574 "base_bdev_name": "malloc3" 00:15:43.574 } 00:15:43.574 } 00:15:43.574 }' 00:15:43.574 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.831 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.831 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.831 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.831 10:42:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.831 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.831 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.831 10:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.831 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.831 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.089 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.089 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:44.089 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:44.089 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:44.346 [2024-07-12 10:42:19.338569] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7088c199-d114-45a6-ad94-fb34dcedcf28 '!=' 7088c199-d114-45a6-ad94-fb34dcedcf28 ']' 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2057267 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2057267 ']' 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2057267 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:44.346 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2057267 00:15:44.347 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:44.347 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:44.347 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2057267' 00:15:44.347 killing process with pid 2057267 00:15:44.347 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2057267 00:15:44.347 [2024-07-12 10:42:19.401554] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:44.347 [2024-07-12 10:42:19.401606] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:44.347 [2024-07-12 10:42:19.401667] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:44.347 [2024-07-12 10:42:19.401679] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x134ac00 name raid_bdev1, state offline 00:15:44.347 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2057267 00:15:44.347 [2024-07-12 10:42:19.429934] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:44.605 10:42:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:44.605 00:15:44.605 real 0m15.125s 00:15:44.605 user 0m27.329s 00:15:44.605 sys 0m2.650s 00:15:44.605 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:44.605 10:42:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.605 ************************************ 00:15:44.605 END TEST raid_superblock_test 00:15:44.605 ************************************ 00:15:44.605 10:42:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:44.605 10:42:19 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:44.605 10:42:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:44.605 10:42:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:44.605 10:42:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:44.605 ************************************ 00:15:44.605 START TEST raid_read_error_test 00:15:44.605 ************************************ 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:44.605 10:42:19 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.E2i8OIoD1e 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2059993 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2059993 /var/tmp/spdk-raid.sock 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2059993 ']' 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:44.605 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:44.605 10:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.605 [2024-07-12 10:42:19.794657] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:15:44.605 [2024-07-12 10:42:19.794720] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2059993 ] 00:15:44.865 [2024-07-12 10:42:19.921892] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:44.865 [2024-07-12 10:42:20.034058] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:45.124 [2024-07-12 10:42:20.092055] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:45.124 [2024-07-12 10:42:20.092085] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:45.692 10:42:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:45.692 10:42:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:45.692 10:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:45.692 10:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:45.951 BaseBdev1_malloc 00:15:45.951 10:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:46.210 true 00:15:46.210 10:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:46.210 [2024-07-12 10:42:21.348163] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:46.210 [2024-07-12 10:42:21.348210] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:46.210 [2024-07-12 10:42:21.348233] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18de0d0 00:15:46.210 [2024-07-12 10:42:21.348246] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:46.210 [2024-07-12 10:42:21.350140] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:46.210 [2024-07-12 10:42:21.350172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:46.210 BaseBdev1 00:15:46.210 10:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:46.210 10:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:46.469 BaseBdev2_malloc 00:15:46.469 10:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:46.728 true 00:15:46.728 10:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:46.987 [2024-07-12 10:42:22.067895] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:46.987 [2024-07-12 10:42:22.067938] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:46.987 [2024-07-12 10:42:22.067958] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e2910 00:15:46.987 [2024-07-12 10:42:22.067971] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:46.987 [2024-07-12 10:42:22.069518] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:46.987 [2024-07-12 10:42:22.069548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:46.987 BaseBdev2 00:15:46.987 10:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:46.987 10:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:47.246 BaseBdev3_malloc 00:15:47.246 10:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:47.505 true 00:15:47.505 10:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:47.764 [2024-07-12 10:42:22.802377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:47.764 [2024-07-12 10:42:22.802429] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.764 [2024-07-12 10:42:22.802452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e4bd0 00:15:47.764 [2024-07-12 10:42:22.802464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.764 [2024-07-12 10:42:22.804060] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.764 [2024-07-12 10:42:22.804093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:47.764 BaseBdev3 00:15:47.764 10:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:48.021 [2024-07-12 10:42:23.043045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:48.021 [2024-07-12 10:42:23.044406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:48.021 [2024-07-12 10:42:23.044475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.021 [2024-07-12 10:42:23.044689] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e6280 00:15:48.021 [2024-07-12 10:42:23.044701] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:48.021 [2024-07-12 10:42:23.044905] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e5e20 00:15:48.021 [2024-07-12 10:42:23.045052] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e6280 00:15:48.021 [2024-07-12 10:42:23.045061] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18e6280 00:15:48.021 [2024-07-12 10:42:23.045164] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:48.021 
10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.021 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.280 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.280 "name": "raid_bdev1", 00:15:48.280 "uuid": "9e545f57-06e9-433e-9151-6b4fdbca0a54", 00:15:48.280 "strip_size_kb": 64, 00:15:48.280 "state": "online", 00:15:48.280 "raid_level": "concat", 00:15:48.280 "superblock": true, 00:15:48.280 "num_base_bdevs": 3, 00:15:48.280 "num_base_bdevs_discovered": 3, 00:15:48.280 "num_base_bdevs_operational": 3, 00:15:48.280 "base_bdevs_list": [ 00:15:48.280 { 00:15:48.280 "name": "BaseBdev1", 00:15:48.280 "uuid": "3d49dc88-8f79-54b5-a1d2-766a09c6ffd0", 00:15:48.280 "is_configured": true, 00:15:48.280 "data_offset": 2048, 00:15:48.280 "data_size": 63488 00:15:48.280 }, 00:15:48.280 { 00:15:48.280 "name": "BaseBdev2", 00:15:48.280 "uuid": "7aeb38c5-5531-5820-8311-1b92ef01bcc8", 00:15:48.280 "is_configured": true, 00:15:48.280 "data_offset": 2048, 00:15:48.280 "data_size": 63488 00:15:48.280 }, 00:15:48.280 { 00:15:48.280 "name": "BaseBdev3", 00:15:48.280 "uuid": "a8d1172d-70c6-5c4a-a47e-74dee954b9ea", 00:15:48.280 "is_configured": true, 00:15:48.280 "data_offset": 2048, 00:15:48.280 "data_size": 63488 00:15:48.280 } 00:15:48.280 ] 00:15:48.280 }' 00:15:48.280 10:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.280 10:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.236 10:42:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:49.236 10:42:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:49.236 [2024-07-12 10:42:24.262568] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17344d0 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.172 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:50.431 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.431 "name": "raid_bdev1", 00:15:50.431 "uuid": "9e545f57-06e9-433e-9151-6b4fdbca0a54", 00:15:50.431 "strip_size_kb": 64, 00:15:50.431 "state": "online", 00:15:50.431 "raid_level": "concat", 00:15:50.431 "superblock": true, 00:15:50.431 "num_base_bdevs": 3, 00:15:50.431 "num_base_bdevs_discovered": 3, 00:15:50.431 "num_base_bdevs_operational": 3, 00:15:50.431 "base_bdevs_list": [ 00:15:50.431 { 00:15:50.431 "name": "BaseBdev1", 00:15:50.431 "uuid": "3d49dc88-8f79-54b5-a1d2-766a09c6ffd0", 00:15:50.431 "is_configured": true, 00:15:50.431 "data_offset": 2048, 00:15:50.431 "data_size": 63488 00:15:50.431 }, 00:15:50.431 { 00:15:50.431 "name": "BaseBdev2", 00:15:50.431 "uuid": "7aeb38c5-5531-5820-8311-1b92ef01bcc8", 00:15:50.431 "is_configured": true, 00:15:50.431 "data_offset": 2048, 00:15:50.431 "data_size": 63488 00:15:50.431 }, 00:15:50.431 { 00:15:50.431 "name": "BaseBdev3", 00:15:50.431 "uuid": "a8d1172d-70c6-5c4a-a47e-74dee954b9ea", 00:15:50.431 "is_configured": true, 00:15:50.431 "data_offset": 2048, 00:15:50.431 "data_size": 63488 00:15:50.431 } 00:15:50.431 ] 00:15:50.431 }' 00:15:50.431 10:42:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.431 10:42:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.997 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:51.254 [2024-07-12 10:42:26.301617] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:51.254 [2024-07-12 10:42:26.301660] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:51.254 [2024-07-12 
10:42:26.304823] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:51.254 [2024-07-12 10:42:26.304861] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.254 [2024-07-12 10:42:26.304896] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:51.254 [2024-07-12 10:42:26.304912] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e6280 name raid_bdev1, state offline 00:15:51.254 0 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2059993 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2059993 ']' 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2059993 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2059993 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2059993' 00:15:51.254 killing process with pid 2059993 00:15:51.254 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2059993 00:15:51.254 [2024-07-12 10:42:26.368792] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:51.255 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2059993 00:15:51.255 [2024-07-12 10:42:26.390249] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.E2i8OIoD1e 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:15:51.513 00:15:51.513 real 0m6.892s 00:15:51.513 user 0m10.947s 00:15:51.513 sys 0m1.170s 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:51.513 10:42:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.513 ************************************ 00:15:51.513 END TEST raid_read_error_test 00:15:51.513 ************************************ 00:15:51.513 10:42:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:51.513 10:42:26 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:15:51.513 10:42:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
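
The read-error test that just finished, and the write-error test starting here, both drive the same raid_io_error_test flow over the RPC socket shown in the trace. Below is a minimal sketch of that sequence, assuming an SPDK checkout at $SPDK (placeholder) with bdevperf already built; only commands that appear in the log above are used, the loop and the /raidtest/bdevperf.log path are illustrative, and the harness itself picks its log name with mktemp.

# start bdevperf in "wait for RPC" mode (-z) and keep its output for the failure-rate check;
# the harness waits for the socket (waitforlisten) before issuing RPCs
$SPDK/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid > /raidtest/bdevperf.log 2>&1 &

rpc="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for i in 1 2 3; do
  # malloc backing device -> error-injection bdev (exposed as EE_BaseBdevN_malloc) -> passthru named BaseBdevN
  $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
  $rpc bdev_error_create BaseBdev${i}_malloc
  $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done

# assemble the concat array with a 64k strip size and an on-disk superblock
$rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

# kick off the I/O workload, then inject failures (read or write) into the first base bdev
$SPDK/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &
$rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure

# the raid bdev must stay online with all 3 base bdevs; concat carries no redundancy
$rpc bdev_raid_get_bdevs all
$rpc bdev_raid_delete raid_bdev1

The pass criterion visible at bdev_raid.sh@843-847 is the failure-rate check on that bdevperf log: the raid_bdev1 per-job line is filtered with grep, the sixth field is taken with awk (0.49 in the read test above), and the run fails unless it differs from 0.00.
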
00:15:51.513 10:42:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:51.513 10:42:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:51.771 ************************************ 00:15:51.771 START TEST raid_write_error_test 00:15:51.771 ************************************ 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.twGssyBP2L 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2061012 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2061012 /var/tmp/spdk-raid.sock 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2061012 ']' 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:51.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:51.771 10:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.771 [2024-07-12 10:42:26.793747] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:15:51.771 [2024-07-12 10:42:26.793815] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2061012 ] 00:15:51.771 [2024-07-12 10:42:26.921984] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:52.029 [2024-07-12 10:42:27.029507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:52.029 [2024-07-12 10:42:27.094444] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.029 [2024-07-12 10:42:27.094507] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.593 10:42:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:52.593 10:42:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:52.593 10:42:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:52.593 10:42:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:52.850 BaseBdev1_malloc 00:15:52.850 10:42:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:53.107 true 00:15:53.107 10:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:53.365 [2024-07-12 10:42:28.463199] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:53.365 [2024-07-12 10:42:28.463247] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.365 [2024-07-12 10:42:28.463267] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23240d0 00:15:53.365 [2024-07-12 10:42:28.463280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.365 [2024-07-12 10:42:28.464982] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:53.365 [2024-07-12 
10:42:28.465011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:53.365 BaseBdev1 00:15:53.365 10:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:53.365 10:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:53.623 BaseBdev2_malloc 00:15:53.623 10:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:53.880 true 00:15:53.880 10:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:54.137 [2024-07-12 10:42:29.141795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:54.137 [2024-07-12 10:42:29.141841] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:54.137 [2024-07-12 10:42:29.141861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2328910 00:15:54.137 [2024-07-12 10:42:29.141874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:54.137 [2024-07-12 10:42:29.143411] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:54.137 [2024-07-12 10:42:29.143440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:54.137 BaseBdev2 00:15:54.137 10:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:54.137 10:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:54.395 BaseBdev3_malloc 00:15:54.395 10:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:54.395 true 00:15:54.395 10:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:54.654 [2024-07-12 10:42:29.816191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:54.654 [2024-07-12 10:42:29.816239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:54.654 [2024-07-12 10:42:29.816257] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232abd0 00:15:54.654 [2024-07-12 10:42:29.816270] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:54.654 [2024-07-12 10:42:29.817660] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:54.654 [2024-07-12 10:42:29.817688] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:54.654 BaseBdev3 00:15:54.654 10:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 
00:15:54.913 [2024-07-12 10:42:30.060874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:54.913 [2024-07-12 10:42:30.062155] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:54.913 [2024-07-12 10:42:30.062225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:54.913 [2024-07-12 10:42:30.062432] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x232c280 00:15:54.913 [2024-07-12 10:42:30.062444] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:54.913 [2024-07-12 10:42:30.062646] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x232be20 00:15:54.913 [2024-07-12 10:42:30.062789] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x232c280 00:15:54.913 [2024-07-12 10:42:30.062800] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x232c280 00:15:54.913 [2024-07-12 10:42:30.062898] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.913 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:55.171 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.171 "name": "raid_bdev1", 00:15:55.171 "uuid": "782be906-f196-469c-8510-d679af3a959a", 00:15:55.171 "strip_size_kb": 64, 00:15:55.171 "state": "online", 00:15:55.171 "raid_level": "concat", 00:15:55.171 "superblock": true, 00:15:55.171 "num_base_bdevs": 3, 00:15:55.171 "num_base_bdevs_discovered": 3, 00:15:55.171 "num_base_bdevs_operational": 3, 00:15:55.171 "base_bdevs_list": [ 00:15:55.171 { 00:15:55.171 "name": "BaseBdev1", 00:15:55.171 "uuid": "4e32c666-6444-5d1e-a61a-4170974c4b31", 00:15:55.171 "is_configured": true, 00:15:55.171 "data_offset": 2048, 00:15:55.171 "data_size": 63488 00:15:55.171 }, 00:15:55.171 { 00:15:55.171 "name": "BaseBdev2", 00:15:55.171 "uuid": "e2d35803-ea0d-58d1-b009-0bf5cf171a7c", 00:15:55.171 "is_configured": true, 00:15:55.171 "data_offset": 2048, 00:15:55.171 "data_size": 63488 00:15:55.171 }, 00:15:55.171 { 00:15:55.171 
"name": "BaseBdev3", 00:15:55.171 "uuid": "440e7078-7c92-5d6f-bd90-9c9e794f026d", 00:15:55.171 "is_configured": true, 00:15:55.171 "data_offset": 2048, 00:15:55.171 "data_size": 63488 00:15:55.171 } 00:15:55.171 ] 00:15:55.171 }' 00:15:55.171 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.171 10:42:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.738 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:55.738 10:42:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:55.738 [2024-07-12 10:42:30.919516] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217a4d0 00:15:56.673 10:42:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.930 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:57.188 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.188 "name": "raid_bdev1", 00:15:57.188 "uuid": "782be906-f196-469c-8510-d679af3a959a", 00:15:57.188 "strip_size_kb": 64, 00:15:57.188 "state": "online", 00:15:57.188 "raid_level": "concat", 00:15:57.188 "superblock": true, 00:15:57.188 "num_base_bdevs": 3, 00:15:57.188 "num_base_bdevs_discovered": 3, 00:15:57.188 "num_base_bdevs_operational": 3, 00:15:57.188 "base_bdevs_list": [ 00:15:57.188 { 00:15:57.188 "name": "BaseBdev1", 00:15:57.188 "uuid": "4e32c666-6444-5d1e-a61a-4170974c4b31", 00:15:57.188 "is_configured": true, 00:15:57.188 "data_offset": 2048, 00:15:57.188 "data_size": 
63488 00:15:57.188 }, 00:15:57.188 { 00:15:57.188 "name": "BaseBdev2", 00:15:57.188 "uuid": "e2d35803-ea0d-58d1-b009-0bf5cf171a7c", 00:15:57.188 "is_configured": true, 00:15:57.188 "data_offset": 2048, 00:15:57.188 "data_size": 63488 00:15:57.188 }, 00:15:57.188 { 00:15:57.188 "name": "BaseBdev3", 00:15:57.188 "uuid": "440e7078-7c92-5d6f-bd90-9c9e794f026d", 00:15:57.188 "is_configured": true, 00:15:57.188 "data_offset": 2048, 00:15:57.188 "data_size": 63488 00:15:57.188 } 00:15:57.188 ] 00:15:57.188 }' 00:15:57.188 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.188 10:42:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.756 10:42:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:58.021 [2024-07-12 10:42:33.047046] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:58.021 [2024-07-12 10:42:33.047091] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:58.021 [2024-07-12 10:42:33.050410] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:58.021 [2024-07-12 10:42:33.050448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:58.021 [2024-07-12 10:42:33.050492] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:58.021 [2024-07-12 10:42:33.050504] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x232c280 name raid_bdev1, state offline 00:15:58.021 0 00:15:58.021 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2061012 00:15:58.021 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2061012 ']' 00:15:58.021 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2061012 00:15:58.021 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:58.021 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:58.022 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2061012 00:15:58.022 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:58.022 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:58.022 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2061012' 00:15:58.022 killing process with pid 2061012 00:15:58.022 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2061012 00:15:58.022 [2024-07-12 10:42:33.114255] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:58.022 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2061012 00:15:58.022 [2024-07-12 10:42:33.135638] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.twGssyBP2L 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:58.282 10:42:33 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:15:58.282 00:15:58.282 real 0m6.660s 00:15:58.282 user 0m10.417s 00:15:58.282 sys 0m1.198s 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:58.282 10:42:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:58.282 ************************************ 00:15:58.282 END TEST raid_write_error_test 00:15:58.282 ************************************ 00:15:58.282 10:42:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:58.282 10:42:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:58.282 10:42:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:15:58.282 10:42:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:58.282 10:42:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:58.282 10:42:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:58.282 ************************************ 00:15:58.282 START TEST raid_state_function_test 00:15:58.282 ************************************ 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2061985 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2061985' 00:15:58.282 Process raid pid: 2061985 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2061985 /var/tmp/spdk-raid.sock 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2061985 ']' 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:58.282 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:58.282 10:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:58.541 [2024-07-12 10:42:33.515769] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:15:58.541 [2024-07-12 10:42:33.515836] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:58.541 [2024-07-12 10:42:33.646794] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:58.799 [2024-07-12 10:42:33.753769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:58.799 [2024-07-12 10:42:33.818436] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:58.799 [2024-07-12 10:42:33.818503] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:59.365 10:42:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:59.365 10:42:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:59.365 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:59.623 [2024-07-12 10:42:34.678441] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:59.623 [2024-07-12 10:42:34.678489] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:59.623 [2024-07-12 10:42:34.678500] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:59.623 [2024-07-12 10:42:34.678512] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:59.623 [2024-07-12 10:42:34.678521] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:59.623 [2024-07-12 10:42:34.678532] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.623 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.882 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:15:59.882 "name": "Existed_Raid", 00:15:59.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.882 "strip_size_kb": 0, 00:15:59.882 "state": "configuring", 00:15:59.882 "raid_level": "raid1", 00:15:59.882 "superblock": false, 00:15:59.882 "num_base_bdevs": 3, 00:15:59.882 "num_base_bdevs_discovered": 0, 00:15:59.882 "num_base_bdevs_operational": 3, 00:15:59.882 "base_bdevs_list": [ 00:15:59.882 { 00:15:59.882 "name": "BaseBdev1", 00:15:59.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.882 "is_configured": false, 00:15:59.882 "data_offset": 0, 00:15:59.882 "data_size": 0 00:15:59.882 }, 00:15:59.882 { 00:15:59.882 "name": "BaseBdev2", 00:15:59.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.882 "is_configured": false, 00:15:59.882 "data_offset": 0, 00:15:59.882 "data_size": 0 00:15:59.882 }, 00:15:59.882 { 00:15:59.882 "name": "BaseBdev3", 00:15:59.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.882 "is_configured": false, 00:15:59.882 "data_offset": 0, 00:15:59.882 "data_size": 0 00:15:59.882 } 00:15:59.882 ] 00:15:59.882 }' 00:15:59.882 10:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.882 10:42:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.450 10:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:00.708 [2024-07-12 10:42:35.749162] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:00.708 [2024-07-12 10:42:35.749190] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x196aa80 name Existed_Raid, state configuring 00:16:00.708 10:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:00.965 [2024-07-12 10:42:35.993811] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:00.965 [2024-07-12 10:42:35.993838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:00.965 [2024-07-12 10:42:35.993848] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:00.965 [2024-07-12 10:42:35.993859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:00.965 [2024-07-12 10:42:35.993867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:00.965 [2024-07-12 10:42:35.993878] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:00.965 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:01.223 [2024-07-12 10:42:36.180096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:01.223 BaseBdev1 00:16:01.223 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:01.223 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:01.223 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:01.223 
10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:01.223 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:01.223 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:01.223 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.223 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:01.481 [ 00:16:01.481 { 00:16:01.481 "name": "BaseBdev1", 00:16:01.481 "aliases": [ 00:16:01.481 "53116ec5-5423-462d-a419-ee176a0da433" 00:16:01.481 ], 00:16:01.481 "product_name": "Malloc disk", 00:16:01.481 "block_size": 512, 00:16:01.481 "num_blocks": 65536, 00:16:01.481 "uuid": "53116ec5-5423-462d-a419-ee176a0da433", 00:16:01.481 "assigned_rate_limits": { 00:16:01.481 "rw_ios_per_sec": 0, 00:16:01.481 "rw_mbytes_per_sec": 0, 00:16:01.481 "r_mbytes_per_sec": 0, 00:16:01.481 "w_mbytes_per_sec": 0 00:16:01.481 }, 00:16:01.481 "claimed": true, 00:16:01.481 "claim_type": "exclusive_write", 00:16:01.481 "zoned": false, 00:16:01.481 "supported_io_types": { 00:16:01.481 "read": true, 00:16:01.481 "write": true, 00:16:01.481 "unmap": true, 00:16:01.481 "flush": true, 00:16:01.481 "reset": true, 00:16:01.481 "nvme_admin": false, 00:16:01.481 "nvme_io": false, 00:16:01.481 "nvme_io_md": false, 00:16:01.481 "write_zeroes": true, 00:16:01.481 "zcopy": true, 00:16:01.481 "get_zone_info": false, 00:16:01.481 "zone_management": false, 00:16:01.481 "zone_append": false, 00:16:01.481 "compare": false, 00:16:01.481 "compare_and_write": false, 00:16:01.481 "abort": true, 00:16:01.481 "seek_hole": false, 00:16:01.481 "seek_data": false, 00:16:01.481 "copy": true, 00:16:01.481 "nvme_iov_md": false 00:16:01.481 }, 00:16:01.481 "memory_domains": [ 00:16:01.481 { 00:16:01.481 "dma_device_id": "system", 00:16:01.481 "dma_device_type": 1 00:16:01.481 }, 00:16:01.481 { 00:16:01.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.481 "dma_device_type": 2 00:16:01.481 } 00:16:01.481 ], 00:16:01.481 "driver_specific": {} 00:16:01.481 } 00:16:01.481 ] 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.481 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.740 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.740 "name": "Existed_Raid", 00:16:01.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.740 "strip_size_kb": 0, 00:16:01.740 "state": "configuring", 00:16:01.740 "raid_level": "raid1", 00:16:01.740 "superblock": false, 00:16:01.740 "num_base_bdevs": 3, 00:16:01.740 "num_base_bdevs_discovered": 1, 00:16:01.740 "num_base_bdevs_operational": 3, 00:16:01.740 "base_bdevs_list": [ 00:16:01.740 { 00:16:01.740 "name": "BaseBdev1", 00:16:01.740 "uuid": "53116ec5-5423-462d-a419-ee176a0da433", 00:16:01.740 "is_configured": true, 00:16:01.740 "data_offset": 0, 00:16:01.740 "data_size": 65536 00:16:01.740 }, 00:16:01.740 { 00:16:01.740 "name": "BaseBdev2", 00:16:01.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.740 "is_configured": false, 00:16:01.740 "data_offset": 0, 00:16:01.740 "data_size": 0 00:16:01.740 }, 00:16:01.740 { 00:16:01.740 "name": "BaseBdev3", 00:16:01.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.740 "is_configured": false, 00:16:01.740 "data_offset": 0, 00:16:01.740 "data_size": 0 00:16:01.740 } 00:16:01.740 ] 00:16:01.740 }' 00:16:01.740 10:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.740 10:42:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.306 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:02.564 [2024-07-12 10:42:37.607871] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:02.564 [2024-07-12 10:42:37.607904] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x196a310 name Existed_Raid, state configuring 00:16:02.564 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:02.822 [2024-07-12 10:42:37.848544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:02.822 [2024-07-12 10:42:37.849970] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:02.822 [2024-07-12 10:42:37.850001] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:02.822 [2024-07-12 10:42:37.850010] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:02.822 [2024-07-12 10:42:37.850022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.822 10:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.081 10:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.081 "name": "Existed_Raid", 00:16:03.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.081 "strip_size_kb": 0, 00:16:03.081 "state": "configuring", 00:16:03.081 "raid_level": "raid1", 00:16:03.081 "superblock": false, 00:16:03.081 "num_base_bdevs": 3, 00:16:03.081 "num_base_bdevs_discovered": 1, 00:16:03.081 "num_base_bdevs_operational": 3, 00:16:03.081 "base_bdevs_list": [ 00:16:03.081 { 00:16:03.081 "name": "BaseBdev1", 00:16:03.081 "uuid": "53116ec5-5423-462d-a419-ee176a0da433", 00:16:03.081 "is_configured": true, 00:16:03.081 "data_offset": 0, 00:16:03.081 "data_size": 65536 00:16:03.081 }, 00:16:03.081 { 00:16:03.082 "name": "BaseBdev2", 00:16:03.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.082 "is_configured": false, 00:16:03.082 "data_offset": 0, 00:16:03.082 "data_size": 0 00:16:03.082 }, 00:16:03.082 { 00:16:03.082 "name": "BaseBdev3", 00:16:03.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.082 "is_configured": false, 00:16:03.082 "data_offset": 0, 00:16:03.082 "data_size": 0 00:16:03.082 } 00:16:03.082 ] 00:16:03.082 }' 00:16:03.082 10:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.082 10:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.650 10:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:03.909 [2024-07-12 10:42:38.939990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:03.909 BaseBdev2 00:16:03.909 10:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:03.909 10:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:03.909 10:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:03.909 10:42:38 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:03.909 10:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:03.909 10:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:03.909 10:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:04.168 10:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:04.428 [ 00:16:04.428 { 00:16:04.428 "name": "BaseBdev2", 00:16:04.428 "aliases": [ 00:16:04.428 "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317" 00:16:04.428 ], 00:16:04.428 "product_name": "Malloc disk", 00:16:04.428 "block_size": 512, 00:16:04.428 "num_blocks": 65536, 00:16:04.428 "uuid": "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317", 00:16:04.428 "assigned_rate_limits": { 00:16:04.428 "rw_ios_per_sec": 0, 00:16:04.428 "rw_mbytes_per_sec": 0, 00:16:04.428 "r_mbytes_per_sec": 0, 00:16:04.428 "w_mbytes_per_sec": 0 00:16:04.428 }, 00:16:04.428 "claimed": true, 00:16:04.428 "claim_type": "exclusive_write", 00:16:04.428 "zoned": false, 00:16:04.428 "supported_io_types": { 00:16:04.428 "read": true, 00:16:04.428 "write": true, 00:16:04.428 "unmap": true, 00:16:04.428 "flush": true, 00:16:04.428 "reset": true, 00:16:04.428 "nvme_admin": false, 00:16:04.428 "nvme_io": false, 00:16:04.428 "nvme_io_md": false, 00:16:04.428 "write_zeroes": true, 00:16:04.428 "zcopy": true, 00:16:04.428 "get_zone_info": false, 00:16:04.428 "zone_management": false, 00:16:04.428 "zone_append": false, 00:16:04.428 "compare": false, 00:16:04.428 "compare_and_write": false, 00:16:04.428 "abort": true, 00:16:04.428 "seek_hole": false, 00:16:04.428 "seek_data": false, 00:16:04.428 "copy": true, 00:16:04.428 "nvme_iov_md": false 00:16:04.428 }, 00:16:04.428 "memory_domains": [ 00:16:04.428 { 00:16:04.428 "dma_device_id": "system", 00:16:04.428 "dma_device_type": 1 00:16:04.428 }, 00:16:04.428 { 00:16:04.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.428 "dma_device_type": 2 00:16:04.428 } 00:16:04.428 ], 00:16:04.428 "driver_specific": {} 00:16:04.428 } 00:16:04.428 ] 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.428 
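A minimal sketch, under the same socket assumption, of the kind of check the verify_raid_bdev_state helper is running at this point: the raid is re-read with bdev_raid_get_bdevs, the Existed_Raid entry is selected with the jq filter shown in the trace, and its state and discovered-member count are compared against the expected values. Only the RPC call and the jq select come from the log; the field comparisons below are illustrative.

  # read back the raid bdev and pick out Existed_Raid (jq filter as in the trace)
  tmp=$(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  state=$(jq -r '.state' <<< "$tmp")
  discovered=$(jq -r '.num_base_bdevs_discovered' <<< "$tmp")

  # With BaseBdev1 and BaseBdev2 claimed but BaseBdev3 still missing, the raid
  # is expected to stay "configuring" with 2 of its 3 members discovered.
  [[ $state == configuring && $discovered -eq 2 ]] || echo "unexpected state: $state ($discovered/3)"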
10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.428 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.688 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.688 "name": "Existed_Raid", 00:16:04.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.688 "strip_size_kb": 0, 00:16:04.688 "state": "configuring", 00:16:04.688 "raid_level": "raid1", 00:16:04.688 "superblock": false, 00:16:04.688 "num_base_bdevs": 3, 00:16:04.688 "num_base_bdevs_discovered": 2, 00:16:04.688 "num_base_bdevs_operational": 3, 00:16:04.688 "base_bdevs_list": [ 00:16:04.688 { 00:16:04.688 "name": "BaseBdev1", 00:16:04.688 "uuid": "53116ec5-5423-462d-a419-ee176a0da433", 00:16:04.688 "is_configured": true, 00:16:04.688 "data_offset": 0, 00:16:04.688 "data_size": 65536 00:16:04.688 }, 00:16:04.688 { 00:16:04.688 "name": "BaseBdev2", 00:16:04.688 "uuid": "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317", 00:16:04.688 "is_configured": true, 00:16:04.688 "data_offset": 0, 00:16:04.688 "data_size": 65536 00:16:04.688 }, 00:16:04.688 { 00:16:04.688 "name": "BaseBdev3", 00:16:04.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.688 "is_configured": false, 00:16:04.688 "data_offset": 0, 00:16:04.688 "data_size": 0 00:16:04.688 } 00:16:04.688 ] 00:16:04.688 }' 00:16:04.688 10:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.688 10:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:05.255 10:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:05.514 [2024-07-12 10:42:40.507600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:05.514 [2024-07-12 10:42:40.507637] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x196b400 00:16:05.514 [2024-07-12 10:42:40.507646] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:05.514 [2024-07-12 10:42:40.507888] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x196aef0 00:16:05.514 [2024-07-12 10:42:40.508006] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x196b400 00:16:05.514 [2024-07-12 10:42:40.508016] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x196b400 00:16:05.514 [2024-07-12 10:42:40.508175] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:05.514 BaseBdev3 00:16:05.514 10:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:05.514 10:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:05.514 10:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.514 10:42:40 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:05.514 10:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.514 10:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:05.514 10:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.773 10:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:06.053 [ 00:16:06.053 { 00:16:06.053 "name": "BaseBdev3", 00:16:06.053 "aliases": [ 00:16:06.053 "16c7da83-ead9-4380-b095-e0fe722860a0" 00:16:06.053 ], 00:16:06.053 "product_name": "Malloc disk", 00:16:06.053 "block_size": 512, 00:16:06.053 "num_blocks": 65536, 00:16:06.053 "uuid": "16c7da83-ead9-4380-b095-e0fe722860a0", 00:16:06.053 "assigned_rate_limits": { 00:16:06.053 "rw_ios_per_sec": 0, 00:16:06.053 "rw_mbytes_per_sec": 0, 00:16:06.053 "r_mbytes_per_sec": 0, 00:16:06.053 "w_mbytes_per_sec": 0 00:16:06.053 }, 00:16:06.053 "claimed": true, 00:16:06.053 "claim_type": "exclusive_write", 00:16:06.053 "zoned": false, 00:16:06.053 "supported_io_types": { 00:16:06.053 "read": true, 00:16:06.053 "write": true, 00:16:06.053 "unmap": true, 00:16:06.053 "flush": true, 00:16:06.053 "reset": true, 00:16:06.053 "nvme_admin": false, 00:16:06.053 "nvme_io": false, 00:16:06.053 "nvme_io_md": false, 00:16:06.053 "write_zeroes": true, 00:16:06.053 "zcopy": true, 00:16:06.053 "get_zone_info": false, 00:16:06.053 "zone_management": false, 00:16:06.053 "zone_append": false, 00:16:06.053 "compare": false, 00:16:06.053 "compare_and_write": false, 00:16:06.053 "abort": true, 00:16:06.053 "seek_hole": false, 00:16:06.053 "seek_data": false, 00:16:06.053 "copy": true, 00:16:06.053 "nvme_iov_md": false 00:16:06.053 }, 00:16:06.053 "memory_domains": [ 00:16:06.053 { 00:16:06.053 "dma_device_id": "system", 00:16:06.053 "dma_device_type": 1 00:16:06.053 }, 00:16:06.053 { 00:16:06.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.053 "dma_device_type": 2 00:16:06.053 } 00:16:06.053 ], 00:16:06.053 "driver_specific": {} 00:16:06.053 } 00:16:06.053 ] 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.053 10:42:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.053 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.353 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.353 "name": "Existed_Raid", 00:16:06.353 "uuid": "b77042c8-0250-4147-b983-c1791a48f071", 00:16:06.353 "strip_size_kb": 0, 00:16:06.353 "state": "online", 00:16:06.353 "raid_level": "raid1", 00:16:06.353 "superblock": false, 00:16:06.353 "num_base_bdevs": 3, 00:16:06.353 "num_base_bdevs_discovered": 3, 00:16:06.353 "num_base_bdevs_operational": 3, 00:16:06.353 "base_bdevs_list": [ 00:16:06.353 { 00:16:06.353 "name": "BaseBdev1", 00:16:06.353 "uuid": "53116ec5-5423-462d-a419-ee176a0da433", 00:16:06.353 "is_configured": true, 00:16:06.353 "data_offset": 0, 00:16:06.353 "data_size": 65536 00:16:06.353 }, 00:16:06.353 { 00:16:06.353 "name": "BaseBdev2", 00:16:06.353 "uuid": "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317", 00:16:06.353 "is_configured": true, 00:16:06.353 "data_offset": 0, 00:16:06.353 "data_size": 65536 00:16:06.353 }, 00:16:06.353 { 00:16:06.353 "name": "BaseBdev3", 00:16:06.353 "uuid": "16c7da83-ead9-4380-b095-e0fe722860a0", 00:16:06.353 "is_configured": true, 00:16:06.353 "data_offset": 0, 00:16:06.353 "data_size": 65536 00:16:06.353 } 00:16:06.353 ] 00:16:06.353 }' 00:16:06.353 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.353 10:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:06.919 10:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:06.919 [2024-07-12 10:42:42.007881] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:06.919 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:06.919 "name": "Existed_Raid", 00:16:06.919 "aliases": [ 00:16:06.919 "b77042c8-0250-4147-b983-c1791a48f071" 00:16:06.919 ], 00:16:06.919 "product_name": "Raid Volume", 00:16:06.919 "block_size": 512, 00:16:06.919 "num_blocks": 65536, 00:16:06.919 "uuid": "b77042c8-0250-4147-b983-c1791a48f071", 
00:16:06.919 "assigned_rate_limits": { 00:16:06.919 "rw_ios_per_sec": 0, 00:16:06.919 "rw_mbytes_per_sec": 0, 00:16:06.919 "r_mbytes_per_sec": 0, 00:16:06.919 "w_mbytes_per_sec": 0 00:16:06.919 }, 00:16:06.919 "claimed": false, 00:16:06.919 "zoned": false, 00:16:06.919 "supported_io_types": { 00:16:06.919 "read": true, 00:16:06.919 "write": true, 00:16:06.919 "unmap": false, 00:16:06.919 "flush": false, 00:16:06.919 "reset": true, 00:16:06.919 "nvme_admin": false, 00:16:06.919 "nvme_io": false, 00:16:06.919 "nvme_io_md": false, 00:16:06.919 "write_zeroes": true, 00:16:06.919 "zcopy": false, 00:16:06.919 "get_zone_info": false, 00:16:06.919 "zone_management": false, 00:16:06.919 "zone_append": false, 00:16:06.919 "compare": false, 00:16:06.919 "compare_and_write": false, 00:16:06.919 "abort": false, 00:16:06.919 "seek_hole": false, 00:16:06.919 "seek_data": false, 00:16:06.919 "copy": false, 00:16:06.919 "nvme_iov_md": false 00:16:06.920 }, 00:16:06.920 "memory_domains": [ 00:16:06.920 { 00:16:06.920 "dma_device_id": "system", 00:16:06.920 "dma_device_type": 1 00:16:06.920 }, 00:16:06.920 { 00:16:06.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.920 "dma_device_type": 2 00:16:06.920 }, 00:16:06.920 { 00:16:06.920 "dma_device_id": "system", 00:16:06.920 "dma_device_type": 1 00:16:06.920 }, 00:16:06.920 { 00:16:06.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.920 "dma_device_type": 2 00:16:06.920 }, 00:16:06.920 { 00:16:06.920 "dma_device_id": "system", 00:16:06.920 "dma_device_type": 1 00:16:06.920 }, 00:16:06.920 { 00:16:06.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.920 "dma_device_type": 2 00:16:06.920 } 00:16:06.920 ], 00:16:06.920 "driver_specific": { 00:16:06.920 "raid": { 00:16:06.920 "uuid": "b77042c8-0250-4147-b983-c1791a48f071", 00:16:06.920 "strip_size_kb": 0, 00:16:06.920 "state": "online", 00:16:06.920 "raid_level": "raid1", 00:16:06.920 "superblock": false, 00:16:06.920 "num_base_bdevs": 3, 00:16:06.920 "num_base_bdevs_discovered": 3, 00:16:06.920 "num_base_bdevs_operational": 3, 00:16:06.920 "base_bdevs_list": [ 00:16:06.920 { 00:16:06.920 "name": "BaseBdev1", 00:16:06.920 "uuid": "53116ec5-5423-462d-a419-ee176a0da433", 00:16:06.920 "is_configured": true, 00:16:06.920 "data_offset": 0, 00:16:06.920 "data_size": 65536 00:16:06.920 }, 00:16:06.920 { 00:16:06.920 "name": "BaseBdev2", 00:16:06.920 "uuid": "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317", 00:16:06.920 "is_configured": true, 00:16:06.920 "data_offset": 0, 00:16:06.920 "data_size": 65536 00:16:06.920 }, 00:16:06.920 { 00:16:06.920 "name": "BaseBdev3", 00:16:06.920 "uuid": "16c7da83-ead9-4380-b095-e0fe722860a0", 00:16:06.920 "is_configured": true, 00:16:06.920 "data_offset": 0, 00:16:06.920 "data_size": 65536 00:16:06.920 } 00:16:06.920 ] 00:16:06.920 } 00:16:06.920 } 00:16:06.920 }' 00:16:06.920 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:06.920 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:06.920 BaseBdev2 00:16:06.920 BaseBdev3' 00:16:06.920 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.920 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:06.920 10:42:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.177 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.177 "name": "BaseBdev1", 00:16:07.177 "aliases": [ 00:16:07.177 "53116ec5-5423-462d-a419-ee176a0da433" 00:16:07.177 ], 00:16:07.177 "product_name": "Malloc disk", 00:16:07.177 "block_size": 512, 00:16:07.177 "num_blocks": 65536, 00:16:07.177 "uuid": "53116ec5-5423-462d-a419-ee176a0da433", 00:16:07.177 "assigned_rate_limits": { 00:16:07.177 "rw_ios_per_sec": 0, 00:16:07.177 "rw_mbytes_per_sec": 0, 00:16:07.177 "r_mbytes_per_sec": 0, 00:16:07.177 "w_mbytes_per_sec": 0 00:16:07.177 }, 00:16:07.177 "claimed": true, 00:16:07.177 "claim_type": "exclusive_write", 00:16:07.177 "zoned": false, 00:16:07.177 "supported_io_types": { 00:16:07.177 "read": true, 00:16:07.177 "write": true, 00:16:07.178 "unmap": true, 00:16:07.178 "flush": true, 00:16:07.178 "reset": true, 00:16:07.178 "nvme_admin": false, 00:16:07.178 "nvme_io": false, 00:16:07.178 "nvme_io_md": false, 00:16:07.178 "write_zeroes": true, 00:16:07.178 "zcopy": true, 00:16:07.178 "get_zone_info": false, 00:16:07.178 "zone_management": false, 00:16:07.178 "zone_append": false, 00:16:07.178 "compare": false, 00:16:07.178 "compare_and_write": false, 00:16:07.178 "abort": true, 00:16:07.178 "seek_hole": false, 00:16:07.178 "seek_data": false, 00:16:07.178 "copy": true, 00:16:07.178 "nvme_iov_md": false 00:16:07.178 }, 00:16:07.178 "memory_domains": [ 00:16:07.178 { 00:16:07.178 "dma_device_id": "system", 00:16:07.178 "dma_device_type": 1 00:16:07.178 }, 00:16:07.178 { 00:16:07.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.178 "dma_device_type": 2 00:16:07.178 } 00:16:07.178 ], 00:16:07.178 "driver_specific": {} 00:16:07.178 }' 00:16:07.178 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.178 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.435 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.435 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.435 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.435 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.435 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.436 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.436 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.436 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.436 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.693 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.693 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.693 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:07.693 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.952 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.952 "name": "BaseBdev2", 
00:16:07.952 "aliases": [ 00:16:07.952 "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317" 00:16:07.952 ], 00:16:07.952 "product_name": "Malloc disk", 00:16:07.952 "block_size": 512, 00:16:07.952 "num_blocks": 65536, 00:16:07.952 "uuid": "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317", 00:16:07.952 "assigned_rate_limits": { 00:16:07.952 "rw_ios_per_sec": 0, 00:16:07.952 "rw_mbytes_per_sec": 0, 00:16:07.952 "r_mbytes_per_sec": 0, 00:16:07.952 "w_mbytes_per_sec": 0 00:16:07.952 }, 00:16:07.952 "claimed": true, 00:16:07.952 "claim_type": "exclusive_write", 00:16:07.952 "zoned": false, 00:16:07.952 "supported_io_types": { 00:16:07.952 "read": true, 00:16:07.952 "write": true, 00:16:07.952 "unmap": true, 00:16:07.952 "flush": true, 00:16:07.952 "reset": true, 00:16:07.952 "nvme_admin": false, 00:16:07.952 "nvme_io": false, 00:16:07.952 "nvme_io_md": false, 00:16:07.952 "write_zeroes": true, 00:16:07.952 "zcopy": true, 00:16:07.952 "get_zone_info": false, 00:16:07.952 "zone_management": false, 00:16:07.952 "zone_append": false, 00:16:07.952 "compare": false, 00:16:07.952 "compare_and_write": false, 00:16:07.952 "abort": true, 00:16:07.952 "seek_hole": false, 00:16:07.952 "seek_data": false, 00:16:07.952 "copy": true, 00:16:07.952 "nvme_iov_md": false 00:16:07.952 }, 00:16:07.952 "memory_domains": [ 00:16:07.952 { 00:16:07.952 "dma_device_id": "system", 00:16:07.952 "dma_device_type": 1 00:16:07.952 }, 00:16:07.952 { 00:16:07.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.952 "dma_device_type": 2 00:16:07.952 } 00:16:07.952 ], 00:16:07.952 "driver_specific": {} 00:16:07.952 }' 00:16:07.952 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.952 10:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.952 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.952 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.952 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.952 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.952 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.952 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.211 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.211 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.211 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.211 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.211 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:08.211 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:08.211 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:08.468 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:08.468 "name": "BaseBdev3", 00:16:08.468 "aliases": [ 00:16:08.468 "16c7da83-ead9-4380-b095-e0fe722860a0" 00:16:08.468 ], 00:16:08.468 "product_name": "Malloc disk", 00:16:08.468 "block_size": 512, 
00:16:08.468 "num_blocks": 65536, 00:16:08.468 "uuid": "16c7da83-ead9-4380-b095-e0fe722860a0", 00:16:08.468 "assigned_rate_limits": { 00:16:08.468 "rw_ios_per_sec": 0, 00:16:08.468 "rw_mbytes_per_sec": 0, 00:16:08.468 "r_mbytes_per_sec": 0, 00:16:08.468 "w_mbytes_per_sec": 0 00:16:08.468 }, 00:16:08.468 "claimed": true, 00:16:08.468 "claim_type": "exclusive_write", 00:16:08.468 "zoned": false, 00:16:08.468 "supported_io_types": { 00:16:08.468 "read": true, 00:16:08.468 "write": true, 00:16:08.468 "unmap": true, 00:16:08.468 "flush": true, 00:16:08.468 "reset": true, 00:16:08.468 "nvme_admin": false, 00:16:08.468 "nvme_io": false, 00:16:08.468 "nvme_io_md": false, 00:16:08.468 "write_zeroes": true, 00:16:08.468 "zcopy": true, 00:16:08.468 "get_zone_info": false, 00:16:08.468 "zone_management": false, 00:16:08.468 "zone_append": false, 00:16:08.468 "compare": false, 00:16:08.468 "compare_and_write": false, 00:16:08.468 "abort": true, 00:16:08.468 "seek_hole": false, 00:16:08.468 "seek_data": false, 00:16:08.468 "copy": true, 00:16:08.468 "nvme_iov_md": false 00:16:08.468 }, 00:16:08.468 "memory_domains": [ 00:16:08.468 { 00:16:08.468 "dma_device_id": "system", 00:16:08.468 "dma_device_type": 1 00:16:08.468 }, 00:16:08.468 { 00:16:08.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.468 "dma_device_type": 2 00:16:08.468 } 00:16:08.468 ], 00:16:08.468 "driver_specific": {} 00:16:08.468 }' 00:16:08.468 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.468 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.468 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:08.468 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.468 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.726 10:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:08.984 [2024-07-12 10:42:44.117209] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.984 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.243 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.243 "name": "Existed_Raid", 00:16:09.243 "uuid": "b77042c8-0250-4147-b983-c1791a48f071", 00:16:09.243 "strip_size_kb": 0, 00:16:09.243 "state": "online", 00:16:09.243 "raid_level": "raid1", 00:16:09.243 "superblock": false, 00:16:09.243 "num_base_bdevs": 3, 00:16:09.243 "num_base_bdevs_discovered": 2, 00:16:09.243 "num_base_bdevs_operational": 2, 00:16:09.243 "base_bdevs_list": [ 00:16:09.243 { 00:16:09.243 "name": null, 00:16:09.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.243 "is_configured": false, 00:16:09.243 "data_offset": 0, 00:16:09.243 "data_size": 65536 00:16:09.243 }, 00:16:09.243 { 00:16:09.243 "name": "BaseBdev2", 00:16:09.243 "uuid": "b5f57cb0-d2cb-4b54-b0c2-6edb2e3f8317", 00:16:09.243 "is_configured": true, 00:16:09.243 "data_offset": 0, 00:16:09.243 "data_size": 65536 00:16:09.243 }, 00:16:09.243 { 00:16:09.243 "name": "BaseBdev3", 00:16:09.243 "uuid": "16c7da83-ead9-4380-b095-e0fe722860a0", 00:16:09.243 "is_configured": true, 00:16:09.243 "data_offset": 0, 00:16:09.243 "data_size": 65536 00:16:09.243 } 00:16:09.243 ] 00:16:09.243 }' 00:16:09.243 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.243 10:42:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.811 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:09.811 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.811 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.811 10:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:10.069 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:10.069 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:16:10.069 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:10.328 [2024-07-12 10:42:45.366421] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:10.328 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.328 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.328 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.328 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:10.587 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:10.587 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:10.587 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:10.846 [2024-07-12 10:42:45.868252] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:10.846 [2024-07-12 10:42:45.868334] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:10.846 [2024-07-12 10:42:45.880986] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:10.846 [2024-07-12 10:42:45.881028] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:10.846 [2024-07-12 10:42:45.881040] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x196b400 name Existed_Raid, state offline 00:16:10.846 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.846 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.846 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.846 10:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:11.104 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:11.104 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:11.104 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:11.104 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:11.104 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.104 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:11.362 BaseBdev2 00:16:11.362 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:11.362 10:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:11.363 10:42:46 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:11.363 10:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:11.363 10:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:11.363 10:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:11.363 10:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.621 10:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:11.879 [ 00:16:11.879 { 00:16:11.879 "name": "BaseBdev2", 00:16:11.879 "aliases": [ 00:16:11.879 "250127b3-0d7a-4296-90a2-3d90ca40a510" 00:16:11.879 ], 00:16:11.879 "product_name": "Malloc disk", 00:16:11.879 "block_size": 512, 00:16:11.879 "num_blocks": 65536, 00:16:11.879 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:11.879 "assigned_rate_limits": { 00:16:11.879 "rw_ios_per_sec": 0, 00:16:11.879 "rw_mbytes_per_sec": 0, 00:16:11.879 "r_mbytes_per_sec": 0, 00:16:11.879 "w_mbytes_per_sec": 0 00:16:11.879 }, 00:16:11.879 "claimed": false, 00:16:11.879 "zoned": false, 00:16:11.879 "supported_io_types": { 00:16:11.879 "read": true, 00:16:11.879 "write": true, 00:16:11.879 "unmap": true, 00:16:11.879 "flush": true, 00:16:11.879 "reset": true, 00:16:11.879 "nvme_admin": false, 00:16:11.879 "nvme_io": false, 00:16:11.879 "nvme_io_md": false, 00:16:11.879 "write_zeroes": true, 00:16:11.879 "zcopy": true, 00:16:11.879 "get_zone_info": false, 00:16:11.879 "zone_management": false, 00:16:11.879 "zone_append": false, 00:16:11.879 "compare": false, 00:16:11.879 "compare_and_write": false, 00:16:11.879 "abort": true, 00:16:11.879 "seek_hole": false, 00:16:11.879 "seek_data": false, 00:16:11.879 "copy": true, 00:16:11.879 "nvme_iov_md": false 00:16:11.879 }, 00:16:11.879 "memory_domains": [ 00:16:11.879 { 00:16:11.879 "dma_device_id": "system", 00:16:11.879 "dma_device_type": 1 00:16:11.879 }, 00:16:11.879 { 00:16:11.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.879 "dma_device_type": 2 00:16:11.879 } 00:16:11.879 ], 00:16:11.879 "driver_specific": {} 00:16:11.879 } 00:16:11.879 ] 00:16:11.879 10:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:11.879 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:11.879 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.880 10:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:12.138 BaseBdev3 00:16:12.139 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:12.139 10:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:12.139 10:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:12.139 10:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:12.139 10:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:12.139 10:42:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:12.139 10:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:12.398 10:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:12.657 [ 00:16:12.657 { 00:16:12.657 "name": "BaseBdev3", 00:16:12.657 "aliases": [ 00:16:12.657 "c2cf05e3-86ff-496b-94b4-054f141ec5ed" 00:16:12.657 ], 00:16:12.657 "product_name": "Malloc disk", 00:16:12.657 "block_size": 512, 00:16:12.657 "num_blocks": 65536, 00:16:12.657 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:12.657 "assigned_rate_limits": { 00:16:12.657 "rw_ios_per_sec": 0, 00:16:12.657 "rw_mbytes_per_sec": 0, 00:16:12.657 "r_mbytes_per_sec": 0, 00:16:12.657 "w_mbytes_per_sec": 0 00:16:12.657 }, 00:16:12.657 "claimed": false, 00:16:12.657 "zoned": false, 00:16:12.657 "supported_io_types": { 00:16:12.657 "read": true, 00:16:12.657 "write": true, 00:16:12.657 "unmap": true, 00:16:12.657 "flush": true, 00:16:12.657 "reset": true, 00:16:12.657 "nvme_admin": false, 00:16:12.657 "nvme_io": false, 00:16:12.657 "nvme_io_md": false, 00:16:12.657 "write_zeroes": true, 00:16:12.657 "zcopy": true, 00:16:12.657 "get_zone_info": false, 00:16:12.657 "zone_management": false, 00:16:12.657 "zone_append": false, 00:16:12.657 "compare": false, 00:16:12.657 "compare_and_write": false, 00:16:12.657 "abort": true, 00:16:12.657 "seek_hole": false, 00:16:12.657 "seek_data": false, 00:16:12.657 "copy": true, 00:16:12.657 "nvme_iov_md": false 00:16:12.657 }, 00:16:12.657 "memory_domains": [ 00:16:12.657 { 00:16:12.657 "dma_device_id": "system", 00:16:12.657 "dma_device_type": 1 00:16:12.657 }, 00:16:12.657 { 00:16:12.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.657 "dma_device_type": 2 00:16:12.657 } 00:16:12.657 ], 00:16:12.657 "driver_specific": {} 00:16:12.657 } 00:16:12.657 ] 00:16:12.657 10:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:12.657 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:12.657 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:12.657 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:12.657 [2024-07-12 10:42:47.839753] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:12.657 [2024-07-12 10:42:47.839795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:12.657 [2024-07-12 10:42:47.839812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:12.657 [2024-07-12 10:42:47.841170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.916 10:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.916 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.916 "name": "Existed_Raid", 00:16:12.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.916 "strip_size_kb": 0, 00:16:12.916 "state": "configuring", 00:16:12.916 "raid_level": "raid1", 00:16:12.916 "superblock": false, 00:16:12.916 "num_base_bdevs": 3, 00:16:12.916 "num_base_bdevs_discovered": 2, 00:16:12.916 "num_base_bdevs_operational": 3, 00:16:12.916 "base_bdevs_list": [ 00:16:12.916 { 00:16:12.916 "name": "BaseBdev1", 00:16:12.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.916 "is_configured": false, 00:16:12.916 "data_offset": 0, 00:16:12.916 "data_size": 0 00:16:12.916 }, 00:16:12.916 { 00:16:12.916 "name": "BaseBdev2", 00:16:12.916 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:12.916 "is_configured": true, 00:16:12.916 "data_offset": 0, 00:16:12.916 "data_size": 65536 00:16:12.916 }, 00:16:12.916 { 00:16:12.916 "name": "BaseBdev3", 00:16:12.916 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:12.916 "is_configured": true, 00:16:12.916 "data_offset": 0, 00:16:12.916 "data_size": 65536 00:16:12.916 } 00:16:12.916 ] 00:16:12.916 }' 00:16:12.916 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.916 10:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:13.854 [2024-07-12 10:42:48.934638] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.854 10:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.112 10:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.113 "name": "Existed_Raid", 00:16:14.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.113 "strip_size_kb": 0, 00:16:14.113 "state": "configuring", 00:16:14.113 "raid_level": "raid1", 00:16:14.113 "superblock": false, 00:16:14.113 "num_base_bdevs": 3, 00:16:14.113 "num_base_bdevs_discovered": 1, 00:16:14.113 "num_base_bdevs_operational": 3, 00:16:14.113 "base_bdevs_list": [ 00:16:14.113 { 00:16:14.113 "name": "BaseBdev1", 00:16:14.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.113 "is_configured": false, 00:16:14.113 "data_offset": 0, 00:16:14.113 "data_size": 0 00:16:14.113 }, 00:16:14.113 { 00:16:14.113 "name": null, 00:16:14.113 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:14.113 "is_configured": false, 00:16:14.113 "data_offset": 0, 00:16:14.113 "data_size": 65536 00:16:14.113 }, 00:16:14.113 { 00:16:14.113 "name": "BaseBdev3", 00:16:14.113 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:14.113 "is_configured": true, 00:16:14.113 "data_offset": 0, 00:16:14.113 "data_size": 65536 00:16:14.113 } 00:16:14.113 ] 00:16:14.113 }' 00:16:14.113 10:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.113 10:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.679 10:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.679 10:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:14.937 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:14.937 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:15.196 [2024-07-12 10:42:50.298764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:15.196 BaseBdev1 00:16:15.196 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:15.196 10:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:15.196 10:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.196 10:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:15.196 10:42:50 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.196 10:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.196 10:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.454 10:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:15.714 [ 00:16:15.714 { 00:16:15.714 "name": "BaseBdev1", 00:16:15.714 "aliases": [ 00:16:15.714 "3c1fd23c-f610-4bba-8695-f53aa96eda59" 00:16:15.714 ], 00:16:15.714 "product_name": "Malloc disk", 00:16:15.714 "block_size": 512, 00:16:15.714 "num_blocks": 65536, 00:16:15.714 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:15.714 "assigned_rate_limits": { 00:16:15.714 "rw_ios_per_sec": 0, 00:16:15.714 "rw_mbytes_per_sec": 0, 00:16:15.714 "r_mbytes_per_sec": 0, 00:16:15.714 "w_mbytes_per_sec": 0 00:16:15.714 }, 00:16:15.714 "claimed": true, 00:16:15.714 "claim_type": "exclusive_write", 00:16:15.714 "zoned": false, 00:16:15.714 "supported_io_types": { 00:16:15.714 "read": true, 00:16:15.714 "write": true, 00:16:15.714 "unmap": true, 00:16:15.714 "flush": true, 00:16:15.714 "reset": true, 00:16:15.714 "nvme_admin": false, 00:16:15.714 "nvme_io": false, 00:16:15.714 "nvme_io_md": false, 00:16:15.714 "write_zeroes": true, 00:16:15.714 "zcopy": true, 00:16:15.714 "get_zone_info": false, 00:16:15.714 "zone_management": false, 00:16:15.714 "zone_append": false, 00:16:15.714 "compare": false, 00:16:15.714 "compare_and_write": false, 00:16:15.714 "abort": true, 00:16:15.714 "seek_hole": false, 00:16:15.714 "seek_data": false, 00:16:15.714 "copy": true, 00:16:15.714 "nvme_iov_md": false 00:16:15.714 }, 00:16:15.714 "memory_domains": [ 00:16:15.714 { 00:16:15.714 "dma_device_id": "system", 00:16:15.714 "dma_device_type": 1 00:16:15.714 }, 00:16:15.714 { 00:16:15.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.714 "dma_device_type": 2 00:16:15.714 } 00:16:15.714 ], 00:16:15.714 "driver_specific": {} 00:16:15.714 } 00:16:15.714 ] 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.714 10:42:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.714 10:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.974 10:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.974 "name": "Existed_Raid", 00:16:15.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.974 "strip_size_kb": 0, 00:16:15.974 "state": "configuring", 00:16:15.974 "raid_level": "raid1", 00:16:15.974 "superblock": false, 00:16:15.974 "num_base_bdevs": 3, 00:16:15.974 "num_base_bdevs_discovered": 2, 00:16:15.974 "num_base_bdevs_operational": 3, 00:16:15.974 "base_bdevs_list": [ 00:16:15.974 { 00:16:15.974 "name": "BaseBdev1", 00:16:15.974 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:15.974 "is_configured": true, 00:16:15.974 "data_offset": 0, 00:16:15.974 "data_size": 65536 00:16:15.974 }, 00:16:15.974 { 00:16:15.974 "name": null, 00:16:15.974 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:15.974 "is_configured": false, 00:16:15.974 "data_offset": 0, 00:16:15.974 "data_size": 65536 00:16:15.974 }, 00:16:15.974 { 00:16:15.974 "name": "BaseBdev3", 00:16:15.974 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:15.974 "is_configured": true, 00:16:15.974 "data_offset": 0, 00:16:15.974 "data_size": 65536 00:16:15.974 } 00:16:15.974 ] 00:16:15.974 }' 00:16:15.974 10:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.974 10:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.542 10:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.542 10:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:16.801 10:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:16.801 10:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:17.060 [2024-07-12 10:42:52.111612] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.060 
10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.060 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.318 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.318 "name": "Existed_Raid", 00:16:17.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.318 "strip_size_kb": 0, 00:16:17.318 "state": "configuring", 00:16:17.319 "raid_level": "raid1", 00:16:17.319 "superblock": false, 00:16:17.319 "num_base_bdevs": 3, 00:16:17.319 "num_base_bdevs_discovered": 1, 00:16:17.319 "num_base_bdevs_operational": 3, 00:16:17.319 "base_bdevs_list": [ 00:16:17.319 { 00:16:17.319 "name": "BaseBdev1", 00:16:17.319 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:17.319 "is_configured": true, 00:16:17.319 "data_offset": 0, 00:16:17.319 "data_size": 65536 00:16:17.319 }, 00:16:17.319 { 00:16:17.319 "name": null, 00:16:17.319 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:17.319 "is_configured": false, 00:16:17.319 "data_offset": 0, 00:16:17.319 "data_size": 65536 00:16:17.319 }, 00:16:17.319 { 00:16:17.319 "name": null, 00:16:17.319 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:17.319 "is_configured": false, 00:16:17.319 "data_offset": 0, 00:16:17.319 "data_size": 65536 00:16:17.319 } 00:16:17.319 ] 00:16:17.319 }' 00:16:17.319 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.319 10:42:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.886 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.886 10:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:18.144 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:18.144 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:18.402 [2024-07-12 10:42:53.447166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.402 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.661 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.661 "name": "Existed_Raid", 00:16:18.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.661 "strip_size_kb": 0, 00:16:18.661 "state": "configuring", 00:16:18.661 "raid_level": "raid1", 00:16:18.661 "superblock": false, 00:16:18.661 "num_base_bdevs": 3, 00:16:18.661 "num_base_bdevs_discovered": 2, 00:16:18.661 "num_base_bdevs_operational": 3, 00:16:18.661 "base_bdevs_list": [ 00:16:18.661 { 00:16:18.661 "name": "BaseBdev1", 00:16:18.661 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:18.661 "is_configured": true, 00:16:18.661 "data_offset": 0, 00:16:18.661 "data_size": 65536 00:16:18.661 }, 00:16:18.661 { 00:16:18.661 "name": null, 00:16:18.661 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:18.661 "is_configured": false, 00:16:18.661 "data_offset": 0, 00:16:18.661 "data_size": 65536 00:16:18.661 }, 00:16:18.661 { 00:16:18.661 "name": "BaseBdev3", 00:16:18.661 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:18.661 "is_configured": true, 00:16:18.661 "data_offset": 0, 00:16:18.661 "data_size": 65536 00:16:18.661 } 00:16:18.661 ] 00:16:18.661 }' 00:16:18.661 10:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.661 10:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.226 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.226 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:19.484 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:19.485 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:19.743 [2024-07-12 10:42:54.714764] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:19.743 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:19.743 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.744 
10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.744 10:42:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.310 10:42:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.310 "name": "Existed_Raid", 00:16:20.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.310 "strip_size_kb": 0, 00:16:20.310 "state": "configuring", 00:16:20.310 "raid_level": "raid1", 00:16:20.310 "superblock": false, 00:16:20.310 "num_base_bdevs": 3, 00:16:20.310 "num_base_bdevs_discovered": 1, 00:16:20.310 "num_base_bdevs_operational": 3, 00:16:20.310 "base_bdevs_list": [ 00:16:20.310 { 00:16:20.310 "name": null, 00:16:20.310 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:20.310 "is_configured": false, 00:16:20.311 "data_offset": 0, 00:16:20.311 "data_size": 65536 00:16:20.311 }, 00:16:20.311 { 00:16:20.311 "name": null, 00:16:20.311 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:20.311 "is_configured": false, 00:16:20.311 "data_offset": 0, 00:16:20.311 "data_size": 65536 00:16:20.311 }, 00:16:20.311 { 00:16:20.311 "name": "BaseBdev3", 00:16:20.311 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:20.311 "is_configured": true, 00:16:20.311 "data_offset": 0, 00:16:20.311 "data_size": 65536 00:16:20.311 } 00:16:20.311 ] 00:16:20.311 }' 00:16:20.311 10:42:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.311 10:42:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.875 10:42:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.875 10:42:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:21.133 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:21.133 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:21.133 [2024-07-12 10:42:56.327656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:21.390 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.391 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:21.648 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.648 "name": "Existed_Raid", 00:16:21.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:21.648 "strip_size_kb": 0, 00:16:21.648 "state": "configuring", 00:16:21.648 "raid_level": "raid1", 00:16:21.648 "superblock": false, 00:16:21.648 "num_base_bdevs": 3, 00:16:21.648 "num_base_bdevs_discovered": 2, 00:16:21.648 "num_base_bdevs_operational": 3, 00:16:21.648 "base_bdevs_list": [ 00:16:21.648 { 00:16:21.648 "name": null, 00:16:21.648 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:21.648 "is_configured": false, 00:16:21.648 "data_offset": 0, 00:16:21.648 "data_size": 65536 00:16:21.648 }, 00:16:21.648 { 00:16:21.648 "name": "BaseBdev2", 00:16:21.648 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:21.648 "is_configured": true, 00:16:21.648 "data_offset": 0, 00:16:21.648 "data_size": 65536 00:16:21.648 }, 00:16:21.648 { 00:16:21.648 "name": "BaseBdev3", 00:16:21.648 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:21.648 "is_configured": true, 00:16:21.648 "data_offset": 0, 00:16:21.648 "data_size": 65536 00:16:21.648 } 00:16:21.648 ] 00:16:21.648 }' 00:16:21.648 10:42:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.648 10:42:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.213 10:42:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.213 10:42:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:22.525 10:42:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:22.525 10:42:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.525 10:42:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:22.785 10:42:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3c1fd23c-f610-4bba-8695-f53aa96eda59 00:16:23.044 [2024-07-12 10:42:58.189079] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:23.044 [2024-07-12 10:42:58.189119] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x196ee40 00:16:23.044 [2024-07-12 10:42:58.189127] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:23.044 [2024-07-12 10:42:58.189315] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x196be60 00:16:23.044 [2024-07-12 10:42:58.189434] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x196ee40 00:16:23.044 [2024-07-12 10:42:58.189443] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x196ee40 00:16:23.044 [2024-07-12 10:42:58.189616] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:23.044 NewBaseBdev 00:16:23.044 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:23.044 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:23.044 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:23.044 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:23.044 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:23.044 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:23.044 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:23.304 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:23.562 [ 00:16:23.562 { 00:16:23.562 "name": "NewBaseBdev", 00:16:23.562 "aliases": [ 00:16:23.562 "3c1fd23c-f610-4bba-8695-f53aa96eda59" 00:16:23.562 ], 00:16:23.562 "product_name": "Malloc disk", 00:16:23.562 "block_size": 512, 00:16:23.562 "num_blocks": 65536, 00:16:23.562 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:23.562 "assigned_rate_limits": { 00:16:23.562 "rw_ios_per_sec": 0, 00:16:23.562 "rw_mbytes_per_sec": 0, 00:16:23.562 "r_mbytes_per_sec": 0, 00:16:23.562 "w_mbytes_per_sec": 0 00:16:23.562 }, 00:16:23.562 "claimed": true, 00:16:23.562 "claim_type": "exclusive_write", 00:16:23.562 "zoned": false, 00:16:23.562 "supported_io_types": { 00:16:23.562 "read": true, 00:16:23.562 "write": true, 00:16:23.562 "unmap": true, 00:16:23.562 "flush": true, 00:16:23.562 "reset": true, 00:16:23.562 "nvme_admin": false, 00:16:23.562 "nvme_io": false, 00:16:23.562 "nvme_io_md": false, 00:16:23.562 "write_zeroes": true, 00:16:23.562 "zcopy": true, 00:16:23.563 "get_zone_info": false, 00:16:23.563 "zone_management": false, 00:16:23.563 "zone_append": false, 00:16:23.563 "compare": false, 00:16:23.563 "compare_and_write": false, 00:16:23.563 "abort": true, 00:16:23.563 "seek_hole": false, 00:16:23.563 "seek_data": false, 00:16:23.563 "copy": true, 00:16:23.563 "nvme_iov_md": false 00:16:23.563 }, 00:16:23.563 "memory_domains": [ 00:16:23.563 { 00:16:23.563 "dma_device_id": "system", 00:16:23.563 "dma_device_type": 1 00:16:23.563 }, 00:16:23.563 { 00:16:23.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.563 "dma_device_type": 2 00:16:23.563 } 00:16:23.563 ], 00:16:23.563 "driver_specific": {} 00:16:23.563 } 00:16:23.563 ] 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.563 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.821 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.821 "name": "Existed_Raid", 00:16:23.821 "uuid": "dc6e1308-72c4-4a23-a5ce-c0da9bdf551d", 00:16:23.821 "strip_size_kb": 0, 00:16:23.821 "state": "online", 00:16:23.821 "raid_level": "raid1", 00:16:23.821 "superblock": false, 00:16:23.821 "num_base_bdevs": 3, 00:16:23.821 "num_base_bdevs_discovered": 3, 00:16:23.821 "num_base_bdevs_operational": 3, 00:16:23.821 "base_bdevs_list": [ 00:16:23.821 { 00:16:23.821 "name": "NewBaseBdev", 00:16:23.821 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:23.821 "is_configured": true, 00:16:23.821 "data_offset": 0, 00:16:23.821 "data_size": 65536 00:16:23.821 }, 00:16:23.821 { 00:16:23.821 "name": "BaseBdev2", 00:16:23.821 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:23.821 "is_configured": true, 00:16:23.821 "data_offset": 0, 00:16:23.821 "data_size": 65536 00:16:23.821 }, 00:16:23.821 { 00:16:23.821 "name": "BaseBdev3", 00:16:23.821 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:23.821 "is_configured": true, 00:16:23.821 "data_offset": 0, 00:16:23.821 "data_size": 65536 00:16:23.821 } 00:16:23.821 ] 00:16:23.821 }' 00:16:23.821 10:42:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.821 10:42:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.388 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:24.388 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:24.388 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:24.388 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:24.388 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:24.388 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:24.388 10:42:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:24.389 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:24.646 [2024-07-12 10:42:59.777605] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:24.646 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:24.647 "name": "Existed_Raid", 00:16:24.647 "aliases": [ 00:16:24.647 "dc6e1308-72c4-4a23-a5ce-c0da9bdf551d" 00:16:24.647 ], 00:16:24.647 "product_name": "Raid Volume", 00:16:24.647 "block_size": 512, 00:16:24.647 "num_blocks": 65536, 00:16:24.647 "uuid": "dc6e1308-72c4-4a23-a5ce-c0da9bdf551d", 00:16:24.647 "assigned_rate_limits": { 00:16:24.647 "rw_ios_per_sec": 0, 00:16:24.647 "rw_mbytes_per_sec": 0, 00:16:24.647 "r_mbytes_per_sec": 0, 00:16:24.647 "w_mbytes_per_sec": 0 00:16:24.647 }, 00:16:24.647 "claimed": false, 00:16:24.647 "zoned": false, 00:16:24.647 "supported_io_types": { 00:16:24.647 "read": true, 00:16:24.647 "write": true, 00:16:24.647 "unmap": false, 00:16:24.647 "flush": false, 00:16:24.647 "reset": true, 00:16:24.647 "nvme_admin": false, 00:16:24.647 "nvme_io": false, 00:16:24.647 "nvme_io_md": false, 00:16:24.647 "write_zeroes": true, 00:16:24.647 "zcopy": false, 00:16:24.647 "get_zone_info": false, 00:16:24.647 "zone_management": false, 00:16:24.647 "zone_append": false, 00:16:24.647 "compare": false, 00:16:24.647 "compare_and_write": false, 00:16:24.647 "abort": false, 00:16:24.647 "seek_hole": false, 00:16:24.647 "seek_data": false, 00:16:24.647 "copy": false, 00:16:24.647 "nvme_iov_md": false 00:16:24.647 }, 00:16:24.647 "memory_domains": [ 00:16:24.647 { 00:16:24.647 "dma_device_id": "system", 00:16:24.647 "dma_device_type": 1 00:16:24.647 }, 00:16:24.647 { 00:16:24.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.647 "dma_device_type": 2 00:16:24.647 }, 00:16:24.647 { 00:16:24.647 "dma_device_id": "system", 00:16:24.647 "dma_device_type": 1 00:16:24.647 }, 00:16:24.647 { 00:16:24.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.647 "dma_device_type": 2 00:16:24.647 }, 00:16:24.647 { 00:16:24.647 "dma_device_id": "system", 00:16:24.647 "dma_device_type": 1 00:16:24.647 }, 00:16:24.647 { 00:16:24.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.647 "dma_device_type": 2 00:16:24.647 } 00:16:24.647 ], 00:16:24.647 "driver_specific": { 00:16:24.647 "raid": { 00:16:24.647 "uuid": "dc6e1308-72c4-4a23-a5ce-c0da9bdf551d", 00:16:24.647 "strip_size_kb": 0, 00:16:24.647 "state": "online", 00:16:24.647 "raid_level": "raid1", 00:16:24.647 "superblock": false, 00:16:24.647 "num_base_bdevs": 3, 00:16:24.647 "num_base_bdevs_discovered": 3, 00:16:24.647 "num_base_bdevs_operational": 3, 00:16:24.647 "base_bdevs_list": [ 00:16:24.647 { 00:16:24.647 "name": "NewBaseBdev", 00:16:24.647 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:24.647 "is_configured": true, 00:16:24.647 "data_offset": 0, 00:16:24.647 "data_size": 65536 00:16:24.647 }, 00:16:24.647 { 00:16:24.647 "name": "BaseBdev2", 00:16:24.647 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:24.647 "is_configured": true, 00:16:24.647 "data_offset": 0, 00:16:24.647 "data_size": 65536 00:16:24.647 }, 00:16:24.647 { 00:16:24.647 "name": "BaseBdev3", 00:16:24.647 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:24.647 "is_configured": true, 00:16:24.647 "data_offset": 0, 00:16:24.647 "data_size": 
65536 00:16:24.647 } 00:16:24.647 ] 00:16:24.647 } 00:16:24.647 } 00:16:24.647 }' 00:16:24.647 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:24.905 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:24.905 BaseBdev2 00:16:24.905 BaseBdev3' 00:16:24.905 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.905 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:24.905 10:42:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.164 "name": "NewBaseBdev", 00:16:25.164 "aliases": [ 00:16:25.164 "3c1fd23c-f610-4bba-8695-f53aa96eda59" 00:16:25.164 ], 00:16:25.164 "product_name": "Malloc disk", 00:16:25.164 "block_size": 512, 00:16:25.164 "num_blocks": 65536, 00:16:25.164 "uuid": "3c1fd23c-f610-4bba-8695-f53aa96eda59", 00:16:25.164 "assigned_rate_limits": { 00:16:25.164 "rw_ios_per_sec": 0, 00:16:25.164 "rw_mbytes_per_sec": 0, 00:16:25.164 "r_mbytes_per_sec": 0, 00:16:25.164 "w_mbytes_per_sec": 0 00:16:25.164 }, 00:16:25.164 "claimed": true, 00:16:25.164 "claim_type": "exclusive_write", 00:16:25.164 "zoned": false, 00:16:25.164 "supported_io_types": { 00:16:25.164 "read": true, 00:16:25.164 "write": true, 00:16:25.164 "unmap": true, 00:16:25.164 "flush": true, 00:16:25.164 "reset": true, 00:16:25.164 "nvme_admin": false, 00:16:25.164 "nvme_io": false, 00:16:25.164 "nvme_io_md": false, 00:16:25.164 "write_zeroes": true, 00:16:25.164 "zcopy": true, 00:16:25.164 "get_zone_info": false, 00:16:25.164 "zone_management": false, 00:16:25.164 "zone_append": false, 00:16:25.164 "compare": false, 00:16:25.164 "compare_and_write": false, 00:16:25.164 "abort": true, 00:16:25.164 "seek_hole": false, 00:16:25.164 "seek_data": false, 00:16:25.164 "copy": true, 00:16:25.164 "nvme_iov_md": false 00:16:25.164 }, 00:16:25.164 "memory_domains": [ 00:16:25.164 { 00:16:25.164 "dma_device_id": "system", 00:16:25.164 "dma_device_type": 1 00:16:25.164 }, 00:16:25.164 { 00:16:25.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.164 "dma_device_type": 2 00:16:25.164 } 00:16:25.164 ], 00:16:25.164 "driver_specific": {} 00:16:25.164 }' 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.164 10:43:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.422 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.422 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.422 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.422 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.422 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:25.422 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.422 "name": "BaseBdev2", 00:16:25.422 "aliases": [ 00:16:25.422 "250127b3-0d7a-4296-90a2-3d90ca40a510" 00:16:25.422 ], 00:16:25.422 "product_name": "Malloc disk", 00:16:25.422 "block_size": 512, 00:16:25.422 "num_blocks": 65536, 00:16:25.422 "uuid": "250127b3-0d7a-4296-90a2-3d90ca40a510", 00:16:25.422 "assigned_rate_limits": { 00:16:25.422 "rw_ios_per_sec": 0, 00:16:25.422 "rw_mbytes_per_sec": 0, 00:16:25.422 "r_mbytes_per_sec": 0, 00:16:25.422 "w_mbytes_per_sec": 0 00:16:25.422 }, 00:16:25.422 "claimed": true, 00:16:25.422 "claim_type": "exclusive_write", 00:16:25.422 "zoned": false, 00:16:25.422 "supported_io_types": { 00:16:25.422 "read": true, 00:16:25.422 "write": true, 00:16:25.422 "unmap": true, 00:16:25.422 "flush": true, 00:16:25.422 "reset": true, 00:16:25.422 "nvme_admin": false, 00:16:25.422 "nvme_io": false, 00:16:25.422 "nvme_io_md": false, 00:16:25.422 "write_zeroes": true, 00:16:25.422 "zcopy": true, 00:16:25.422 "get_zone_info": false, 00:16:25.422 "zone_management": false, 00:16:25.422 "zone_append": false, 00:16:25.422 "compare": false, 00:16:25.422 "compare_and_write": false, 00:16:25.422 "abort": true, 00:16:25.422 "seek_hole": false, 00:16:25.422 "seek_data": false, 00:16:25.422 "copy": true, 00:16:25.422 "nvme_iov_md": false 00:16:25.422 }, 00:16:25.422 "memory_domains": [ 00:16:25.422 { 00:16:25.422 "dma_device_id": "system", 00:16:25.422 "dma_device_type": 1 00:16:25.422 }, 00:16:25.422 { 00:16:25.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.422 "dma_device_type": 2 00:16:25.422 } 00:16:25.422 ], 00:16:25.422 "driver_specific": {} 00:16:25.422 }' 00:16:25.422 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.681 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.681 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.681 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.681 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.681 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.681 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.681 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.939 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.939 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.939 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.939 10:43:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.939 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.939 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:25.939 10:43:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:26.197 "name": "BaseBdev3", 00:16:26.197 "aliases": [ 00:16:26.197 "c2cf05e3-86ff-496b-94b4-054f141ec5ed" 00:16:26.197 ], 00:16:26.197 "product_name": "Malloc disk", 00:16:26.197 "block_size": 512, 00:16:26.197 "num_blocks": 65536, 00:16:26.197 "uuid": "c2cf05e3-86ff-496b-94b4-054f141ec5ed", 00:16:26.197 "assigned_rate_limits": { 00:16:26.197 "rw_ios_per_sec": 0, 00:16:26.197 "rw_mbytes_per_sec": 0, 00:16:26.197 "r_mbytes_per_sec": 0, 00:16:26.197 "w_mbytes_per_sec": 0 00:16:26.197 }, 00:16:26.197 "claimed": true, 00:16:26.197 "claim_type": "exclusive_write", 00:16:26.197 "zoned": false, 00:16:26.197 "supported_io_types": { 00:16:26.197 "read": true, 00:16:26.197 "write": true, 00:16:26.197 "unmap": true, 00:16:26.197 "flush": true, 00:16:26.197 "reset": true, 00:16:26.197 "nvme_admin": false, 00:16:26.197 "nvme_io": false, 00:16:26.197 "nvme_io_md": false, 00:16:26.197 "write_zeroes": true, 00:16:26.197 "zcopy": true, 00:16:26.197 "get_zone_info": false, 00:16:26.197 "zone_management": false, 00:16:26.197 "zone_append": false, 00:16:26.197 "compare": false, 00:16:26.197 "compare_and_write": false, 00:16:26.197 "abort": true, 00:16:26.197 "seek_hole": false, 00:16:26.197 "seek_data": false, 00:16:26.197 "copy": true, 00:16:26.197 "nvme_iov_md": false 00:16:26.197 }, 00:16:26.197 "memory_domains": [ 00:16:26.197 { 00:16:26.197 "dma_device_id": "system", 00:16:26.197 "dma_device_type": 1 00:16:26.197 }, 00:16:26.197 { 00:16:26.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.197 "dma_device_type": 2 00:16:26.197 } 00:16:26.197 ], 00:16:26.197 "driver_specific": {} 00:16:26.197 }' 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.197 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.454 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.454 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.454 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.454 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.454 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.454 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:26.712 [2024-07-12 10:43:01.786789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:26.712 [2024-07-12 10:43:01.786822] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:26.712 [2024-07-12 10:43:01.786876] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:26.712 [2024-07-12 10:43:01.787138] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:26.712 [2024-07-12 10:43:01.787149] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x196ee40 name Existed_Raid, state offline 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2061985 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2061985 ']' 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2061985 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2061985 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2061985' 00:16:26.712 killing process with pid 2061985 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2061985 00:16:26.712 [2024-07-12 10:43:01.855248] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:26.712 10:43:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2061985 00:16:26.712 [2024-07-12 10:43:01.882155] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:26.971 10:43:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:26.971 00:16:26.971 real 0m28.647s 00:16:26.971 user 0m52.579s 00:16:26.971 sys 0m5.116s 00:16:26.971 10:43:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:26.971 10:43:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.971 ************************************ 00:16:26.971 END TEST raid_state_function_test 00:16:26.971 ************************************ 00:16:26.971 10:43:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:26.971 10:43:02 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:16:26.971 10:43:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:26.971 10:43:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:26.971 10:43:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:27.229 ************************************ 00:16:27.229 START TEST raid_state_function_test_sb 00:16:27.229 ************************************ 00:16:27.229 10:43:02 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:16:27.229 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:27.229 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:27.229 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:27.229 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2066298 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2066298' 00:16:27.230 Process raid pid: 2066298 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2066298 /var/tmp/spdk-raid.sock 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 2066298 ']' 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:27.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:27.230 10:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:27.230 [2024-07-12 10:43:02.258190] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:16:27.230 [2024-07-12 10:43:02.258264] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:27.230 [2024-07-12 10:43:02.388920] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:27.488 [2024-07-12 10:43:02.493098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:27.488 [2024-07-12 10:43:02.557935] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:27.488 [2024-07-12 10:43:02.557970] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.057 10:43:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:28.057 10:43:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:28.057 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:28.315 [2024-07-12 10:43:03.273237] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:28.315 [2024-07-12 10:43:03.273281] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:28.315 [2024-07-12 10:43:03.273296] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:28.316 [2024-07-12 10:43:03.273308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:28.316 [2024-07-12 10:43:03.273317] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:28.316 [2024-07-12 10:43:03.273328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.316 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.573 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.573 "name": "Existed_Raid", 00:16:28.573 "uuid": "c1cc9f7e-ee0a-42ae-9e8c-00beead105cb", 00:16:28.573 "strip_size_kb": 0, 00:16:28.573 "state": "configuring", 00:16:28.573 "raid_level": "raid1", 00:16:28.573 "superblock": true, 00:16:28.573 "num_base_bdevs": 3, 00:16:28.573 "num_base_bdevs_discovered": 0, 00:16:28.573 "num_base_bdevs_operational": 3, 00:16:28.573 "base_bdevs_list": [ 00:16:28.573 { 00:16:28.573 "name": "BaseBdev1", 00:16:28.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.573 "is_configured": false, 00:16:28.573 "data_offset": 0, 00:16:28.573 "data_size": 0 00:16:28.573 }, 00:16:28.573 { 00:16:28.573 "name": "BaseBdev2", 00:16:28.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.573 "is_configured": false, 00:16:28.573 "data_offset": 0, 00:16:28.573 "data_size": 0 00:16:28.573 }, 00:16:28.573 { 00:16:28.573 "name": "BaseBdev3", 00:16:28.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.573 "is_configured": false, 00:16:28.573 "data_offset": 0, 00:16:28.573 "data_size": 0 00:16:28.573 } 00:16:28.573 ] 00:16:28.573 }' 00:16:28.573 10:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.573 10:43:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:29.139 10:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:29.397 [2024-07-12 10:43:04.347938] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:29.397 [2024-07-12 10:43:04.347972] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb08a80 name Existed_Raid, state configuring 00:16:29.397 10:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:29.654 [2024-07-12 10:43:04.592617] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:29.654 [2024-07-12 10:43:04.592653] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:29.654 [2024-07-12 10:43:04.592663] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:29.654 [2024-07-12 10:43:04.592675] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:16:29.654 [2024-07-12 10:43:04.592693] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:29.654 [2024-07-12 10:43:04.592705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:29.654 10:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:29.654 [2024-07-12 10:43:04.848358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:29.654 BaseBdev1 00:16:29.912 10:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:29.912 10:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:29.912 10:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:29.912 10:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:29.912 10:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:29.912 10:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:29.912 10:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:30.170 [ 00:16:30.170 { 00:16:30.170 "name": "BaseBdev1", 00:16:30.170 "aliases": [ 00:16:30.170 "965bcd67-2085-42e7-9175-ae620f9ae0e9" 00:16:30.170 ], 00:16:30.170 "product_name": "Malloc disk", 00:16:30.170 "block_size": 512, 00:16:30.170 "num_blocks": 65536, 00:16:30.170 "uuid": "965bcd67-2085-42e7-9175-ae620f9ae0e9", 00:16:30.170 "assigned_rate_limits": { 00:16:30.170 "rw_ios_per_sec": 0, 00:16:30.170 "rw_mbytes_per_sec": 0, 00:16:30.170 "r_mbytes_per_sec": 0, 00:16:30.170 "w_mbytes_per_sec": 0 00:16:30.170 }, 00:16:30.170 "claimed": true, 00:16:30.170 "claim_type": "exclusive_write", 00:16:30.170 "zoned": false, 00:16:30.170 "supported_io_types": { 00:16:30.170 "read": true, 00:16:30.170 "write": true, 00:16:30.170 "unmap": true, 00:16:30.170 "flush": true, 00:16:30.170 "reset": true, 00:16:30.170 "nvme_admin": false, 00:16:30.170 "nvme_io": false, 00:16:30.170 "nvme_io_md": false, 00:16:30.170 "write_zeroes": true, 00:16:30.170 "zcopy": true, 00:16:30.170 "get_zone_info": false, 00:16:30.170 "zone_management": false, 00:16:30.170 "zone_append": false, 00:16:30.170 "compare": false, 00:16:30.170 "compare_and_write": false, 00:16:30.170 "abort": true, 00:16:30.170 "seek_hole": false, 00:16:30.170 "seek_data": false, 00:16:30.170 "copy": true, 00:16:30.170 "nvme_iov_md": false 00:16:30.170 }, 00:16:30.170 "memory_domains": [ 00:16:30.170 { 00:16:30.170 "dma_device_id": "system", 00:16:30.170 "dma_device_type": 1 00:16:30.170 }, 00:16:30.170 { 00:16:30.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.170 "dma_device_type": 2 00:16:30.170 } 00:16:30.170 ], 00:16:30.170 "driver_specific": {} 00:16:30.170 } 00:16:30.170 ] 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 
00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.170 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:30.428 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:30.428 "name": "Existed_Raid", 00:16:30.428 "uuid": "390eb0d6-c091-44a9-b658-cb794e8d6b57", 00:16:30.428 "strip_size_kb": 0, 00:16:30.428 "state": "configuring", 00:16:30.428 "raid_level": "raid1", 00:16:30.428 "superblock": true, 00:16:30.428 "num_base_bdevs": 3, 00:16:30.428 "num_base_bdevs_discovered": 1, 00:16:30.428 "num_base_bdevs_operational": 3, 00:16:30.428 "base_bdevs_list": [ 00:16:30.428 { 00:16:30.428 "name": "BaseBdev1", 00:16:30.428 "uuid": "965bcd67-2085-42e7-9175-ae620f9ae0e9", 00:16:30.428 "is_configured": true, 00:16:30.428 "data_offset": 2048, 00:16:30.428 "data_size": 63488 00:16:30.428 }, 00:16:30.428 { 00:16:30.428 "name": "BaseBdev2", 00:16:30.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.428 "is_configured": false, 00:16:30.428 "data_offset": 0, 00:16:30.428 "data_size": 0 00:16:30.428 }, 00:16:30.428 { 00:16:30.428 "name": "BaseBdev3", 00:16:30.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.428 "is_configured": false, 00:16:30.428 "data_offset": 0, 00:16:30.428 "data_size": 0 00:16:30.428 } 00:16:30.428 ] 00:16:30.428 }' 00:16:30.428 10:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:30.428 10:43:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.360 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:31.360 [2024-07-12 10:43:06.424549] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:31.360 [2024-07-12 10:43:06.424589] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb08310 name Existed_Raid, state configuring 00:16:31.360 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:31.617 [2024-07-12 10:43:06.669233] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:31.617 [2024-07-12 10:43:06.670677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:31.618 [2024-07-12 10:43:06.670712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:31.618 [2024-07-12 10:43:06.670722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:31.618 [2024-07-12 10:43:06.670733] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.618 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.875 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.875 "name": "Existed_Raid", 00:16:31.875 "uuid": "317e4ccf-9906-4d2d-b385-c41b451d852c", 00:16:31.875 "strip_size_kb": 0, 00:16:31.875 "state": "configuring", 00:16:31.875 "raid_level": "raid1", 00:16:31.875 "superblock": true, 00:16:31.875 "num_base_bdevs": 3, 00:16:31.875 "num_base_bdevs_discovered": 1, 00:16:31.875 "num_base_bdevs_operational": 3, 00:16:31.875 "base_bdevs_list": [ 00:16:31.875 { 00:16:31.875 "name": "BaseBdev1", 00:16:31.875 "uuid": "965bcd67-2085-42e7-9175-ae620f9ae0e9", 00:16:31.875 "is_configured": true, 00:16:31.875 "data_offset": 2048, 00:16:31.875 "data_size": 63488 00:16:31.875 }, 00:16:31.875 { 00:16:31.875 "name": "BaseBdev2", 00:16:31.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.875 "is_configured": false, 00:16:31.875 "data_offset": 0, 00:16:31.875 "data_size": 0 00:16:31.875 }, 00:16:31.875 { 00:16:31.875 "name": 
"BaseBdev3", 00:16:31.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.875 "is_configured": false, 00:16:31.875 "data_offset": 0, 00:16:31.875 "data_size": 0 00:16:31.875 } 00:16:31.875 ] 00:16:31.875 }' 00:16:31.875 10:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.875 10:43:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:32.440 10:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:32.697 [2024-07-12 10:43:07.803983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:32.697 BaseBdev2 00:16:32.697 10:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:32.697 10:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:32.697 10:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:32.697 10:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:32.697 10:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:32.697 10:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:32.697 10:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.955 10:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:33.212 [ 00:16:33.212 { 00:16:33.212 "name": "BaseBdev2", 00:16:33.212 "aliases": [ 00:16:33.212 "f600c2b6-6851-482a-b35c-e8590acaa353" 00:16:33.212 ], 00:16:33.212 "product_name": "Malloc disk", 00:16:33.212 "block_size": 512, 00:16:33.212 "num_blocks": 65536, 00:16:33.212 "uuid": "f600c2b6-6851-482a-b35c-e8590acaa353", 00:16:33.212 "assigned_rate_limits": { 00:16:33.212 "rw_ios_per_sec": 0, 00:16:33.212 "rw_mbytes_per_sec": 0, 00:16:33.212 "r_mbytes_per_sec": 0, 00:16:33.212 "w_mbytes_per_sec": 0 00:16:33.212 }, 00:16:33.212 "claimed": true, 00:16:33.212 "claim_type": "exclusive_write", 00:16:33.212 "zoned": false, 00:16:33.212 "supported_io_types": { 00:16:33.212 "read": true, 00:16:33.212 "write": true, 00:16:33.212 "unmap": true, 00:16:33.212 "flush": true, 00:16:33.212 "reset": true, 00:16:33.212 "nvme_admin": false, 00:16:33.212 "nvme_io": false, 00:16:33.212 "nvme_io_md": false, 00:16:33.212 "write_zeroes": true, 00:16:33.212 "zcopy": true, 00:16:33.212 "get_zone_info": false, 00:16:33.212 "zone_management": false, 00:16:33.212 "zone_append": false, 00:16:33.212 "compare": false, 00:16:33.212 "compare_and_write": false, 00:16:33.212 "abort": true, 00:16:33.212 "seek_hole": false, 00:16:33.212 "seek_data": false, 00:16:33.212 "copy": true, 00:16:33.212 "nvme_iov_md": false 00:16:33.212 }, 00:16:33.212 "memory_domains": [ 00:16:33.212 { 00:16:33.212 "dma_device_id": "system", 00:16:33.212 "dma_device_type": 1 00:16:33.212 }, 00:16:33.212 { 00:16:33.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.212 "dma_device_type": 2 00:16:33.212 } 00:16:33.212 ], 00:16:33.212 "driver_specific": {} 
00:16:33.212 } 00:16:33.212 ] 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.212 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.470 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.470 "name": "Existed_Raid", 00:16:33.470 "uuid": "317e4ccf-9906-4d2d-b385-c41b451d852c", 00:16:33.470 "strip_size_kb": 0, 00:16:33.470 "state": "configuring", 00:16:33.470 "raid_level": "raid1", 00:16:33.470 "superblock": true, 00:16:33.470 "num_base_bdevs": 3, 00:16:33.470 "num_base_bdevs_discovered": 2, 00:16:33.470 "num_base_bdevs_operational": 3, 00:16:33.470 "base_bdevs_list": [ 00:16:33.470 { 00:16:33.470 "name": "BaseBdev1", 00:16:33.470 "uuid": "965bcd67-2085-42e7-9175-ae620f9ae0e9", 00:16:33.470 "is_configured": true, 00:16:33.470 "data_offset": 2048, 00:16:33.470 "data_size": 63488 00:16:33.470 }, 00:16:33.470 { 00:16:33.470 "name": "BaseBdev2", 00:16:33.470 "uuid": "f600c2b6-6851-482a-b35c-e8590acaa353", 00:16:33.470 "is_configured": true, 00:16:33.470 "data_offset": 2048, 00:16:33.470 "data_size": 63488 00:16:33.470 }, 00:16:33.470 { 00:16:33.470 "name": "BaseBdev3", 00:16:33.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.470 "is_configured": false, 00:16:33.470 "data_offset": 0, 00:16:33.470 "data_size": 0 00:16:33.470 } 00:16:33.470 ] 00:16:33.470 }' 00:16:33.470 10:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.470 10:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.033 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:34.290 [2024-07-12 
10:43:09.367489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:34.290 [2024-07-12 10:43:09.367657] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb09400 00:16:34.290 [2024-07-12 10:43:09.367672] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:34.290 [2024-07-12 10:43:09.367846] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb08ef0 00:16:34.290 [2024-07-12 10:43:09.367964] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb09400 00:16:34.291 [2024-07-12 10:43:09.367974] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb09400 00:16:34.291 [2024-07-12 10:43:09.368063] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:34.291 BaseBdev3 00:16:34.291 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:34.291 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:34.291 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:34.291 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:34.291 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:34.291 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:34.291 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.548 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:34.805 [ 00:16:34.805 { 00:16:34.805 "name": "BaseBdev3", 00:16:34.805 "aliases": [ 00:16:34.805 "698068df-87dc-4b3d-9437-28a5d2f11142" 00:16:34.805 ], 00:16:34.805 "product_name": "Malloc disk", 00:16:34.805 "block_size": 512, 00:16:34.805 "num_blocks": 65536, 00:16:34.805 "uuid": "698068df-87dc-4b3d-9437-28a5d2f11142", 00:16:34.805 "assigned_rate_limits": { 00:16:34.805 "rw_ios_per_sec": 0, 00:16:34.805 "rw_mbytes_per_sec": 0, 00:16:34.805 "r_mbytes_per_sec": 0, 00:16:34.805 "w_mbytes_per_sec": 0 00:16:34.805 }, 00:16:34.805 "claimed": true, 00:16:34.805 "claim_type": "exclusive_write", 00:16:34.805 "zoned": false, 00:16:34.805 "supported_io_types": { 00:16:34.805 "read": true, 00:16:34.805 "write": true, 00:16:34.805 "unmap": true, 00:16:34.805 "flush": true, 00:16:34.805 "reset": true, 00:16:34.805 "nvme_admin": false, 00:16:34.805 "nvme_io": false, 00:16:34.805 "nvme_io_md": false, 00:16:34.805 "write_zeroes": true, 00:16:34.805 "zcopy": true, 00:16:34.805 "get_zone_info": false, 00:16:34.805 "zone_management": false, 00:16:34.805 "zone_append": false, 00:16:34.805 "compare": false, 00:16:34.805 "compare_and_write": false, 00:16:34.805 "abort": true, 00:16:34.805 "seek_hole": false, 00:16:34.805 "seek_data": false, 00:16:34.805 "copy": true, 00:16:34.805 "nvme_iov_md": false 00:16:34.805 }, 00:16:34.805 "memory_domains": [ 00:16:34.805 { 00:16:34.805 "dma_device_id": "system", 00:16:34.805 "dma_device_type": 1 00:16:34.805 }, 00:16:34.805 { 00:16:34.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.805 
"dma_device_type": 2 00:16:34.805 } 00:16:34.805 ], 00:16:34.805 "driver_specific": {} 00:16:34.805 } 00:16:34.805 ] 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.805 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.806 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.806 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.806 10:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.063 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.063 "name": "Existed_Raid", 00:16:35.063 "uuid": "317e4ccf-9906-4d2d-b385-c41b451d852c", 00:16:35.063 "strip_size_kb": 0, 00:16:35.063 "state": "online", 00:16:35.063 "raid_level": "raid1", 00:16:35.063 "superblock": true, 00:16:35.063 "num_base_bdevs": 3, 00:16:35.063 "num_base_bdevs_discovered": 3, 00:16:35.063 "num_base_bdevs_operational": 3, 00:16:35.063 "base_bdevs_list": [ 00:16:35.063 { 00:16:35.063 "name": "BaseBdev1", 00:16:35.063 "uuid": "965bcd67-2085-42e7-9175-ae620f9ae0e9", 00:16:35.063 "is_configured": true, 00:16:35.063 "data_offset": 2048, 00:16:35.063 "data_size": 63488 00:16:35.063 }, 00:16:35.063 { 00:16:35.063 "name": "BaseBdev2", 00:16:35.063 "uuid": "f600c2b6-6851-482a-b35c-e8590acaa353", 00:16:35.063 "is_configured": true, 00:16:35.063 "data_offset": 2048, 00:16:35.063 "data_size": 63488 00:16:35.063 }, 00:16:35.063 { 00:16:35.063 "name": "BaseBdev3", 00:16:35.063 "uuid": "698068df-87dc-4b3d-9437-28a5d2f11142", 00:16:35.063 "is_configured": true, 00:16:35.063 "data_offset": 2048, 00:16:35.063 "data_size": 63488 00:16:35.063 } 00:16:35.063 ] 00:16:35.063 }' 00:16:35.063 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.063 10:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:35.628 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:35.629 10:43:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:35.629 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:35.629 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:35.629 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:35.629 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:35.629 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:35.629 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:35.887 [2024-07-12 10:43:10.935953] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:35.887 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:35.887 "name": "Existed_Raid", 00:16:35.887 "aliases": [ 00:16:35.887 "317e4ccf-9906-4d2d-b385-c41b451d852c" 00:16:35.887 ], 00:16:35.887 "product_name": "Raid Volume", 00:16:35.887 "block_size": 512, 00:16:35.887 "num_blocks": 63488, 00:16:35.887 "uuid": "317e4ccf-9906-4d2d-b385-c41b451d852c", 00:16:35.887 "assigned_rate_limits": { 00:16:35.887 "rw_ios_per_sec": 0, 00:16:35.887 "rw_mbytes_per_sec": 0, 00:16:35.887 "r_mbytes_per_sec": 0, 00:16:35.887 "w_mbytes_per_sec": 0 00:16:35.887 }, 00:16:35.887 "claimed": false, 00:16:35.887 "zoned": false, 00:16:35.887 "supported_io_types": { 00:16:35.887 "read": true, 00:16:35.887 "write": true, 00:16:35.887 "unmap": false, 00:16:35.887 "flush": false, 00:16:35.887 "reset": true, 00:16:35.887 "nvme_admin": false, 00:16:35.887 "nvme_io": false, 00:16:35.887 "nvme_io_md": false, 00:16:35.887 "write_zeroes": true, 00:16:35.887 "zcopy": false, 00:16:35.887 "get_zone_info": false, 00:16:35.887 "zone_management": false, 00:16:35.887 "zone_append": false, 00:16:35.887 "compare": false, 00:16:35.887 "compare_and_write": false, 00:16:35.887 "abort": false, 00:16:35.887 "seek_hole": false, 00:16:35.887 "seek_data": false, 00:16:35.887 "copy": false, 00:16:35.887 "nvme_iov_md": false 00:16:35.887 }, 00:16:35.887 "memory_domains": [ 00:16:35.887 { 00:16:35.887 "dma_device_id": "system", 00:16:35.887 "dma_device_type": 1 00:16:35.887 }, 00:16:35.887 { 00:16:35.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.887 "dma_device_type": 2 00:16:35.887 }, 00:16:35.887 { 00:16:35.887 "dma_device_id": "system", 00:16:35.887 "dma_device_type": 1 00:16:35.887 }, 00:16:35.887 { 00:16:35.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.887 "dma_device_type": 2 00:16:35.887 }, 00:16:35.887 { 00:16:35.887 "dma_device_id": "system", 00:16:35.887 "dma_device_type": 1 00:16:35.887 }, 00:16:35.887 { 00:16:35.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.887 "dma_device_type": 2 00:16:35.887 } 00:16:35.887 ], 00:16:35.887 "driver_specific": { 00:16:35.887 "raid": { 00:16:35.887 "uuid": "317e4ccf-9906-4d2d-b385-c41b451d852c", 00:16:35.887 "strip_size_kb": 0, 00:16:35.887 "state": "online", 00:16:35.887 "raid_level": "raid1", 00:16:35.887 "superblock": true, 00:16:35.887 "num_base_bdevs": 3, 00:16:35.887 "num_base_bdevs_discovered": 3, 00:16:35.887 "num_base_bdevs_operational": 3, 00:16:35.887 "base_bdevs_list": [ 00:16:35.887 { 00:16:35.887 "name": "BaseBdev1", 00:16:35.887 "uuid": 
"965bcd67-2085-42e7-9175-ae620f9ae0e9", 00:16:35.887 "is_configured": true, 00:16:35.887 "data_offset": 2048, 00:16:35.887 "data_size": 63488 00:16:35.887 }, 00:16:35.887 { 00:16:35.887 "name": "BaseBdev2", 00:16:35.887 "uuid": "f600c2b6-6851-482a-b35c-e8590acaa353", 00:16:35.887 "is_configured": true, 00:16:35.887 "data_offset": 2048, 00:16:35.887 "data_size": 63488 00:16:35.887 }, 00:16:35.887 { 00:16:35.887 "name": "BaseBdev3", 00:16:35.887 "uuid": "698068df-87dc-4b3d-9437-28a5d2f11142", 00:16:35.887 "is_configured": true, 00:16:35.887 "data_offset": 2048, 00:16:35.887 "data_size": 63488 00:16:35.887 } 00:16:35.887 ] 00:16:35.887 } 00:16:35.887 } 00:16:35.887 }' 00:16:35.887 10:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:35.887 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:35.887 BaseBdev2 00:16:35.887 BaseBdev3' 00:16:35.887 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.887 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:35.887 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.146 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.146 "name": "BaseBdev1", 00:16:36.146 "aliases": [ 00:16:36.146 "965bcd67-2085-42e7-9175-ae620f9ae0e9" 00:16:36.146 ], 00:16:36.146 "product_name": "Malloc disk", 00:16:36.146 "block_size": 512, 00:16:36.146 "num_blocks": 65536, 00:16:36.146 "uuid": "965bcd67-2085-42e7-9175-ae620f9ae0e9", 00:16:36.146 "assigned_rate_limits": { 00:16:36.146 "rw_ios_per_sec": 0, 00:16:36.146 "rw_mbytes_per_sec": 0, 00:16:36.146 "r_mbytes_per_sec": 0, 00:16:36.146 "w_mbytes_per_sec": 0 00:16:36.146 }, 00:16:36.146 "claimed": true, 00:16:36.146 "claim_type": "exclusive_write", 00:16:36.146 "zoned": false, 00:16:36.146 "supported_io_types": { 00:16:36.146 "read": true, 00:16:36.146 "write": true, 00:16:36.146 "unmap": true, 00:16:36.146 "flush": true, 00:16:36.146 "reset": true, 00:16:36.146 "nvme_admin": false, 00:16:36.146 "nvme_io": false, 00:16:36.146 "nvme_io_md": false, 00:16:36.146 "write_zeroes": true, 00:16:36.146 "zcopy": true, 00:16:36.146 "get_zone_info": false, 00:16:36.146 "zone_management": false, 00:16:36.146 "zone_append": false, 00:16:36.146 "compare": false, 00:16:36.146 "compare_and_write": false, 00:16:36.146 "abort": true, 00:16:36.146 "seek_hole": false, 00:16:36.146 "seek_data": false, 00:16:36.146 "copy": true, 00:16:36.146 "nvme_iov_md": false 00:16:36.146 }, 00:16:36.146 "memory_domains": [ 00:16:36.146 { 00:16:36.146 "dma_device_id": "system", 00:16:36.146 "dma_device_type": 1 00:16:36.146 }, 00:16:36.146 { 00:16:36.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.146 "dma_device_type": 2 00:16:36.146 } 00:16:36.146 ], 00:16:36.146 "driver_specific": {} 00:16:36.146 }' 00:16:36.146 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.146 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.146 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.146 10:43:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:36.404 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.662 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.662 "name": "BaseBdev2", 00:16:36.662 "aliases": [ 00:16:36.662 "f600c2b6-6851-482a-b35c-e8590acaa353" 00:16:36.662 ], 00:16:36.662 "product_name": "Malloc disk", 00:16:36.662 "block_size": 512, 00:16:36.662 "num_blocks": 65536, 00:16:36.662 "uuid": "f600c2b6-6851-482a-b35c-e8590acaa353", 00:16:36.662 "assigned_rate_limits": { 00:16:36.662 "rw_ios_per_sec": 0, 00:16:36.662 "rw_mbytes_per_sec": 0, 00:16:36.662 "r_mbytes_per_sec": 0, 00:16:36.662 "w_mbytes_per_sec": 0 00:16:36.662 }, 00:16:36.662 "claimed": true, 00:16:36.662 "claim_type": "exclusive_write", 00:16:36.662 "zoned": false, 00:16:36.662 "supported_io_types": { 00:16:36.662 "read": true, 00:16:36.662 "write": true, 00:16:36.662 "unmap": true, 00:16:36.662 "flush": true, 00:16:36.662 "reset": true, 00:16:36.662 "nvme_admin": false, 00:16:36.662 "nvme_io": false, 00:16:36.662 "nvme_io_md": false, 00:16:36.662 "write_zeroes": true, 00:16:36.662 "zcopy": true, 00:16:36.662 "get_zone_info": false, 00:16:36.662 "zone_management": false, 00:16:36.662 "zone_append": false, 00:16:36.662 "compare": false, 00:16:36.662 "compare_and_write": false, 00:16:36.662 "abort": true, 00:16:36.662 "seek_hole": false, 00:16:36.662 "seek_data": false, 00:16:36.662 "copy": true, 00:16:36.662 "nvme_iov_md": false 00:16:36.662 }, 00:16:36.662 "memory_domains": [ 00:16:36.662 { 00:16:36.662 "dma_device_id": "system", 00:16:36.662 "dma_device_type": 1 00:16:36.662 }, 00:16:36.662 { 00:16:36.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.662 "dma_device_type": 2 00:16:36.662 } 00:16:36.662 ], 00:16:36.662 "driver_specific": {} 00:16:36.662 }' 00:16:36.662 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.920 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.920 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.920 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.920 10:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:16:36.920 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.920 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.920 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.177 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.178 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.178 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.178 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.178 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.178 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:37.178 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:37.435 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.435 "name": "BaseBdev3", 00:16:37.435 "aliases": [ 00:16:37.435 "698068df-87dc-4b3d-9437-28a5d2f11142" 00:16:37.435 ], 00:16:37.435 "product_name": "Malloc disk", 00:16:37.435 "block_size": 512, 00:16:37.435 "num_blocks": 65536, 00:16:37.435 "uuid": "698068df-87dc-4b3d-9437-28a5d2f11142", 00:16:37.435 "assigned_rate_limits": { 00:16:37.435 "rw_ios_per_sec": 0, 00:16:37.435 "rw_mbytes_per_sec": 0, 00:16:37.435 "r_mbytes_per_sec": 0, 00:16:37.435 "w_mbytes_per_sec": 0 00:16:37.435 }, 00:16:37.435 "claimed": true, 00:16:37.435 "claim_type": "exclusive_write", 00:16:37.435 "zoned": false, 00:16:37.435 "supported_io_types": { 00:16:37.435 "read": true, 00:16:37.435 "write": true, 00:16:37.435 "unmap": true, 00:16:37.435 "flush": true, 00:16:37.435 "reset": true, 00:16:37.435 "nvme_admin": false, 00:16:37.435 "nvme_io": false, 00:16:37.435 "nvme_io_md": false, 00:16:37.435 "write_zeroes": true, 00:16:37.435 "zcopy": true, 00:16:37.435 "get_zone_info": false, 00:16:37.435 "zone_management": false, 00:16:37.435 "zone_append": false, 00:16:37.435 "compare": false, 00:16:37.435 "compare_and_write": false, 00:16:37.435 "abort": true, 00:16:37.435 "seek_hole": false, 00:16:37.435 "seek_data": false, 00:16:37.435 "copy": true, 00:16:37.435 "nvme_iov_md": false 00:16:37.435 }, 00:16:37.435 "memory_domains": [ 00:16:37.435 { 00:16:37.435 "dma_device_id": "system", 00:16:37.435 "dma_device_type": 1 00:16:37.435 }, 00:16:37.435 { 00:16:37.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.435 "dma_device_type": 2 00:16:37.435 } 00:16:37.435 ], 00:16:37.435 "driver_specific": {} 00:16:37.435 }' 00:16:37.435 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.435 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.435 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.435 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.435 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.435 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.435 
10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.694 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.694 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.694 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.694 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.694 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.694 10:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:37.954 [2024-07-12 10:43:13.029256] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.954 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:38.213 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.213 "name": "Existed_Raid", 00:16:38.213 "uuid": "317e4ccf-9906-4d2d-b385-c41b451d852c", 00:16:38.213 "strip_size_kb": 0, 00:16:38.213 "state": "online", 00:16:38.213 "raid_level": "raid1", 00:16:38.213 "superblock": true, 00:16:38.213 "num_base_bdevs": 3, 00:16:38.213 "num_base_bdevs_discovered": 2, 00:16:38.213 "num_base_bdevs_operational": 2, 00:16:38.213 "base_bdevs_list": [ 00:16:38.213 { 00:16:38.213 "name": null, 00:16:38.213 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:38.213 "is_configured": false, 00:16:38.213 "data_offset": 2048, 00:16:38.213 "data_size": 63488 00:16:38.213 }, 00:16:38.213 { 00:16:38.213 "name": "BaseBdev2", 00:16:38.213 "uuid": "f600c2b6-6851-482a-b35c-e8590acaa353", 00:16:38.213 "is_configured": true, 00:16:38.213 "data_offset": 2048, 00:16:38.213 "data_size": 63488 00:16:38.213 }, 00:16:38.213 { 00:16:38.213 "name": "BaseBdev3", 00:16:38.213 "uuid": "698068df-87dc-4b3d-9437-28a5d2f11142", 00:16:38.213 "is_configured": true, 00:16:38.213 "data_offset": 2048, 00:16:38.213 "data_size": 63488 00:16:38.213 } 00:16:38.213 ] 00:16:38.213 }' 00:16:38.213 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.213 10:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.779 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:38.779 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.779 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.779 10:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:39.038 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:39.038 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:39.038 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:39.376 [2024-07-12 10:43:14.370314] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:39.376 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:39.376 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:39.376 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.376 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:39.634 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:39.634 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:39.634 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:39.892 [2024-07-12 10:43:14.866261] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:39.892 [2024-07-12 10:43:14.866351] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:39.892 [2024-07-12 10:43:14.879022] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:39.892 [2024-07-12 10:43:14.879058] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:39.892 [2024-07-12 10:43:14.879069] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb09400 
name Existed_Raid, state offline 00:16:39.892 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:39.892 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:39.892 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.892 10:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:40.151 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:40.151 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:40.151 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:40.151 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:40.151 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:40.151 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:40.410 BaseBdev2 00:16:40.410 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:40.410 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:40.410 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:40.410 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:40.410 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:40.410 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:40.410 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.669 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:40.927 [ 00:16:40.927 { 00:16:40.927 "name": "BaseBdev2", 00:16:40.927 "aliases": [ 00:16:40.927 "a02fca80-d07e-4ff1-b6d2-25f70f651eb9" 00:16:40.927 ], 00:16:40.927 "product_name": "Malloc disk", 00:16:40.927 "block_size": 512, 00:16:40.927 "num_blocks": 65536, 00:16:40.927 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:40.927 "assigned_rate_limits": { 00:16:40.927 "rw_ios_per_sec": 0, 00:16:40.927 "rw_mbytes_per_sec": 0, 00:16:40.927 "r_mbytes_per_sec": 0, 00:16:40.927 "w_mbytes_per_sec": 0 00:16:40.927 }, 00:16:40.927 "claimed": false, 00:16:40.927 "zoned": false, 00:16:40.927 "supported_io_types": { 00:16:40.927 "read": true, 00:16:40.927 "write": true, 00:16:40.927 "unmap": true, 00:16:40.927 "flush": true, 00:16:40.927 "reset": true, 00:16:40.927 "nvme_admin": false, 00:16:40.927 "nvme_io": false, 00:16:40.927 "nvme_io_md": false, 00:16:40.927 "write_zeroes": true, 00:16:40.927 "zcopy": true, 00:16:40.927 "get_zone_info": false, 00:16:40.927 "zone_management": false, 00:16:40.927 "zone_append": false, 00:16:40.927 "compare": false, 00:16:40.927 
"compare_and_write": false, 00:16:40.927 "abort": true, 00:16:40.927 "seek_hole": false, 00:16:40.927 "seek_data": false, 00:16:40.927 "copy": true, 00:16:40.927 "nvme_iov_md": false 00:16:40.927 }, 00:16:40.927 "memory_domains": [ 00:16:40.927 { 00:16:40.927 "dma_device_id": "system", 00:16:40.927 "dma_device_type": 1 00:16:40.927 }, 00:16:40.927 { 00:16:40.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.927 "dma_device_type": 2 00:16:40.927 } 00:16:40.927 ], 00:16:40.927 "driver_specific": {} 00:16:40.927 } 00:16:40.927 ] 00:16:40.927 10:43:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:40.927 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:40.927 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:40.927 10:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:40.927 BaseBdev3 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:41.186 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:41.445 [ 00:16:41.445 { 00:16:41.445 "name": "BaseBdev3", 00:16:41.445 "aliases": [ 00:16:41.445 "f2fe0411-9c75-464e-b520-0ac25272e1d2" 00:16:41.445 ], 00:16:41.445 "product_name": "Malloc disk", 00:16:41.445 "block_size": 512, 00:16:41.445 "num_blocks": 65536, 00:16:41.445 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:41.445 "assigned_rate_limits": { 00:16:41.445 "rw_ios_per_sec": 0, 00:16:41.445 "rw_mbytes_per_sec": 0, 00:16:41.445 "r_mbytes_per_sec": 0, 00:16:41.445 "w_mbytes_per_sec": 0 00:16:41.445 }, 00:16:41.445 "claimed": false, 00:16:41.445 "zoned": false, 00:16:41.445 "supported_io_types": { 00:16:41.445 "read": true, 00:16:41.445 "write": true, 00:16:41.445 "unmap": true, 00:16:41.445 "flush": true, 00:16:41.445 "reset": true, 00:16:41.445 "nvme_admin": false, 00:16:41.445 "nvme_io": false, 00:16:41.445 "nvme_io_md": false, 00:16:41.445 "write_zeroes": true, 00:16:41.445 "zcopy": true, 00:16:41.445 "get_zone_info": false, 00:16:41.445 "zone_management": false, 00:16:41.445 "zone_append": false, 00:16:41.445 "compare": false, 00:16:41.445 "compare_and_write": false, 00:16:41.445 "abort": true, 00:16:41.445 "seek_hole": false, 00:16:41.445 "seek_data": false, 00:16:41.445 "copy": true, 00:16:41.445 "nvme_iov_md": false 00:16:41.445 }, 00:16:41.445 "memory_domains": [ 00:16:41.445 { 
00:16:41.445 "dma_device_id": "system", 00:16:41.445 "dma_device_type": 1 00:16:41.445 }, 00:16:41.445 { 00:16:41.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.445 "dma_device_type": 2 00:16:41.445 } 00:16:41.445 ], 00:16:41.445 "driver_specific": {} 00:16:41.445 } 00:16:41.445 ] 00:16:41.445 10:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:41.445 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:41.445 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:41.445 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:41.703 [2024-07-12 10:43:16.740495] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:41.703 [2024-07-12 10:43:16.740537] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:41.703 [2024-07-12 10:43:16.740556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:41.703 [2024-07-12 10:43:16.741876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.703 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.704 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.704 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.704 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.704 10:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.961 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.961 "name": "Existed_Raid", 00:16:41.961 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:41.961 "strip_size_kb": 0, 00:16:41.961 "state": "configuring", 00:16:41.961 "raid_level": "raid1", 00:16:41.961 "superblock": true, 00:16:41.961 "num_base_bdevs": 3, 00:16:41.961 "num_base_bdevs_discovered": 2, 00:16:41.961 "num_base_bdevs_operational": 3, 00:16:41.961 "base_bdevs_list": [ 00:16:41.961 { 00:16:41.961 "name": "BaseBdev1", 00:16:41.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:41.961 "is_configured": 
false, 00:16:41.961 "data_offset": 0, 00:16:41.961 "data_size": 0 00:16:41.961 }, 00:16:41.961 { 00:16:41.961 "name": "BaseBdev2", 00:16:41.961 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:41.961 "is_configured": true, 00:16:41.961 "data_offset": 2048, 00:16:41.961 "data_size": 63488 00:16:41.961 }, 00:16:41.961 { 00:16:41.961 "name": "BaseBdev3", 00:16:41.961 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:41.961 "is_configured": true, 00:16:41.961 "data_offset": 2048, 00:16:41.961 "data_size": 63488 00:16:41.961 } 00:16:41.961 ] 00:16:41.961 }' 00:16:41.961 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.961 10:43:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.526 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:42.783 [2024-07-12 10:43:17.811300] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.783 10:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.041 10:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.041 "name": "Existed_Raid", 00:16:43.041 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:43.041 "strip_size_kb": 0, 00:16:43.041 "state": "configuring", 00:16:43.041 "raid_level": "raid1", 00:16:43.041 "superblock": true, 00:16:43.041 "num_base_bdevs": 3, 00:16:43.041 "num_base_bdevs_discovered": 1, 00:16:43.041 "num_base_bdevs_operational": 3, 00:16:43.041 "base_bdevs_list": [ 00:16:43.041 { 00:16:43.041 "name": "BaseBdev1", 00:16:43.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.041 "is_configured": false, 00:16:43.041 "data_offset": 0, 00:16:43.041 "data_size": 0 00:16:43.041 }, 00:16:43.041 { 00:16:43.041 "name": null, 00:16:43.041 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:43.041 "is_configured": false, 00:16:43.041 "data_offset": 2048, 00:16:43.041 "data_size": 
63488 00:16:43.041 }, 00:16:43.041 { 00:16:43.041 "name": "BaseBdev3", 00:16:43.041 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:43.041 "is_configured": true, 00:16:43.041 "data_offset": 2048, 00:16:43.041 "data_size": 63488 00:16:43.041 } 00:16:43.041 ] 00:16:43.041 }' 00:16:43.041 10:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.041 10:43:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.605 10:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:43.605 10:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.863 10:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:43.863 10:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:44.120 [2024-07-12 10:43:19.147405] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:44.120 BaseBdev1 00:16:44.120 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:44.120 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:44.120 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:44.120 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:44.120 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:44.120 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:44.120 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:44.378 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:44.636 [ 00:16:44.636 { 00:16:44.636 "name": "BaseBdev1", 00:16:44.636 "aliases": [ 00:16:44.636 "200d124c-1371-4993-9942-0f63a9df68a1" 00:16:44.636 ], 00:16:44.636 "product_name": "Malloc disk", 00:16:44.636 "block_size": 512, 00:16:44.636 "num_blocks": 65536, 00:16:44.636 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:44.636 "assigned_rate_limits": { 00:16:44.636 "rw_ios_per_sec": 0, 00:16:44.636 "rw_mbytes_per_sec": 0, 00:16:44.636 "r_mbytes_per_sec": 0, 00:16:44.636 "w_mbytes_per_sec": 0 00:16:44.636 }, 00:16:44.636 "claimed": true, 00:16:44.636 "claim_type": "exclusive_write", 00:16:44.636 "zoned": false, 00:16:44.636 "supported_io_types": { 00:16:44.636 "read": true, 00:16:44.636 "write": true, 00:16:44.636 "unmap": true, 00:16:44.636 "flush": true, 00:16:44.636 "reset": true, 00:16:44.636 "nvme_admin": false, 00:16:44.636 "nvme_io": false, 00:16:44.636 "nvme_io_md": false, 00:16:44.636 "write_zeroes": true, 00:16:44.636 "zcopy": true, 00:16:44.636 "get_zone_info": false, 00:16:44.636 "zone_management": false, 00:16:44.636 "zone_append": false, 00:16:44.636 "compare": false, 00:16:44.636 
"compare_and_write": false, 00:16:44.636 "abort": true, 00:16:44.636 "seek_hole": false, 00:16:44.636 "seek_data": false, 00:16:44.636 "copy": true, 00:16:44.636 "nvme_iov_md": false 00:16:44.636 }, 00:16:44.636 "memory_domains": [ 00:16:44.636 { 00:16:44.636 "dma_device_id": "system", 00:16:44.636 "dma_device_type": 1 00:16:44.636 }, 00:16:44.636 { 00:16:44.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.636 "dma_device_type": 2 00:16:44.636 } 00:16:44.636 ], 00:16:44.636 "driver_specific": {} 00:16:44.636 } 00:16:44.636 ] 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.636 "name": "Existed_Raid", 00:16:44.636 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:44.636 "strip_size_kb": 0, 00:16:44.636 "state": "configuring", 00:16:44.636 "raid_level": "raid1", 00:16:44.636 "superblock": true, 00:16:44.636 "num_base_bdevs": 3, 00:16:44.636 "num_base_bdevs_discovered": 2, 00:16:44.636 "num_base_bdevs_operational": 3, 00:16:44.636 "base_bdevs_list": [ 00:16:44.636 { 00:16:44.636 "name": "BaseBdev1", 00:16:44.636 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:44.636 "is_configured": true, 00:16:44.636 "data_offset": 2048, 00:16:44.636 "data_size": 63488 00:16:44.636 }, 00:16:44.636 { 00:16:44.636 "name": null, 00:16:44.636 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:44.636 "is_configured": false, 00:16:44.636 "data_offset": 2048, 00:16:44.636 "data_size": 63488 00:16:44.636 }, 00:16:44.636 { 00:16:44.636 "name": "BaseBdev3", 00:16:44.636 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:44.636 "is_configured": true, 00:16:44.636 "data_offset": 2048, 00:16:44.636 "data_size": 63488 00:16:44.636 } 00:16:44.636 ] 00:16:44.636 }' 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.636 10:43:19 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:16:45.201 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.201 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:45.458 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:45.458 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:45.716 [2024-07-12 10:43:20.816066] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.716 10:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.973 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.973 "name": "Existed_Raid", 00:16:45.973 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:45.973 "strip_size_kb": 0, 00:16:45.973 "state": "configuring", 00:16:45.973 "raid_level": "raid1", 00:16:45.973 "superblock": true, 00:16:45.973 "num_base_bdevs": 3, 00:16:45.973 "num_base_bdevs_discovered": 1, 00:16:45.973 "num_base_bdevs_operational": 3, 00:16:45.973 "base_bdevs_list": [ 00:16:45.973 { 00:16:45.973 "name": "BaseBdev1", 00:16:45.973 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:45.973 "is_configured": true, 00:16:45.973 "data_offset": 2048, 00:16:45.973 "data_size": 63488 00:16:45.973 }, 00:16:45.973 { 00:16:45.973 "name": null, 00:16:45.973 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:45.973 "is_configured": false, 00:16:45.973 "data_offset": 2048, 00:16:45.973 "data_size": 63488 00:16:45.973 }, 00:16:45.973 { 00:16:45.973 "name": null, 00:16:45.973 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:45.973 "is_configured": false, 00:16:45.973 "data_offset": 2048, 00:16:45.973 "data_size": 63488 00:16:45.973 } 00:16:45.973 ] 00:16:45.973 }' 
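Each verify step in this trace reduces to one bdev_raid_get_bdevs RPC whose JSON is filtered with jq and compared field by field against the expected raid state. A minimal sketch of that kind of check, assuming the same raid RPC socket; the helper name check_raid_state and the hard-coded expectations are illustrative, not the harness's own code:

    # Sketch only: mirrors the RPC-plus-jq pattern visible in the trace above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    check_raid_state() {
        local name=$1 want_state=$2 want_discovered=$3
        local info
        # Dump all raid bdevs and keep only the one we care about.
        info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r --arg n "$name" '.[] | select(.name == $n)')
        [ "$(echo "$info" | jq -r '.state')" = "$want_state" ] || return 1
        [ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq "$want_discovered" ] || return 1
    }

    # After bdev_raid_remove_base_bdev BaseBdev3, the dump above shows the set
    # back in "configuring" with a single discovered base bdev:
    check_raid_state Existed_Raid configuring 1

Keeping the comparison per-field rather than diffing the whole JSON is what lets the test tolerate fields it does not care about (uuids, offsets) while still pinning down state transitions.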
00:16:45.973 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.973 10:43:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:46.537 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.537 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:46.794 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:46.794 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:46.794 [2024-07-12 10:43:21.975164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.051 10:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.051 10:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.051 "name": "Existed_Raid", 00:16:47.051 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:47.051 "strip_size_kb": 0, 00:16:47.051 "state": "configuring", 00:16:47.051 "raid_level": "raid1", 00:16:47.051 "superblock": true, 00:16:47.051 "num_base_bdevs": 3, 00:16:47.051 "num_base_bdevs_discovered": 2, 00:16:47.051 "num_base_bdevs_operational": 3, 00:16:47.051 "base_bdevs_list": [ 00:16:47.051 { 00:16:47.051 "name": "BaseBdev1", 00:16:47.051 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:47.051 "is_configured": true, 00:16:47.051 "data_offset": 2048, 00:16:47.051 "data_size": 63488 00:16:47.051 }, 00:16:47.051 { 00:16:47.051 "name": null, 00:16:47.051 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:47.051 "is_configured": false, 00:16:47.051 "data_offset": 2048, 00:16:47.051 "data_size": 63488 00:16:47.051 }, 00:16:47.051 { 00:16:47.051 "name": "BaseBdev3", 
00:16:47.051 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:47.051 "is_configured": true, 00:16:47.051 "data_offset": 2048, 00:16:47.051 "data_size": 63488 00:16:47.051 } 00:16:47.051 ] 00:16:47.051 }' 00:16:47.051 10:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.051 10:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.987 10:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.987 10:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:47.987 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:47.987 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:48.246 [2024-07-12 10:43:23.294691] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.246 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.811 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.811 "name": "Existed_Raid", 00:16:48.811 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:48.811 "strip_size_kb": 0, 00:16:48.811 "state": "configuring", 00:16:48.811 "raid_level": "raid1", 00:16:48.811 "superblock": true, 00:16:48.811 "num_base_bdevs": 3, 00:16:48.811 "num_base_bdevs_discovered": 1, 00:16:48.811 "num_base_bdevs_operational": 3, 00:16:48.811 "base_bdevs_list": [ 00:16:48.811 { 00:16:48.811 "name": null, 00:16:48.811 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:48.811 "is_configured": false, 00:16:48.811 "data_offset": 2048, 00:16:48.811 "data_size": 63488 00:16:48.811 }, 00:16:48.811 { 00:16:48.811 "name": null, 00:16:48.811 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:48.811 
"is_configured": false, 00:16:48.811 "data_offset": 2048, 00:16:48.811 "data_size": 63488 00:16:48.811 }, 00:16:48.811 { 00:16:48.811 "name": "BaseBdev3", 00:16:48.811 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:48.811 "is_configured": true, 00:16:48.811 "data_offset": 2048, 00:16:48.811 "data_size": 63488 00:16:48.811 } 00:16:48.811 ] 00:16:48.811 }' 00:16:48.811 10:43:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.811 10:43:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:49.378 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.378 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:49.635 [2024-07-12 10:43:24.753198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.635 10:43:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.893 10:43:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.893 "name": "Existed_Raid", 00:16:49.893 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:49.893 "strip_size_kb": 0, 00:16:49.893 "state": "configuring", 00:16:49.893 "raid_level": "raid1", 00:16:49.893 "superblock": true, 00:16:49.893 "num_base_bdevs": 3, 00:16:49.893 "num_base_bdevs_discovered": 2, 00:16:49.893 "num_base_bdevs_operational": 3, 00:16:49.893 "base_bdevs_list": [ 00:16:49.893 { 00:16:49.893 "name": null, 00:16:49.893 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:49.893 "is_configured": false, 
00:16:49.893 "data_offset": 2048, 00:16:49.893 "data_size": 63488 00:16:49.893 }, 00:16:49.893 { 00:16:49.893 "name": "BaseBdev2", 00:16:49.893 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:49.893 "is_configured": true, 00:16:49.893 "data_offset": 2048, 00:16:49.893 "data_size": 63488 00:16:49.893 }, 00:16:49.893 { 00:16:49.893 "name": "BaseBdev3", 00:16:49.893 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:49.893 "is_configured": true, 00:16:49.893 "data_offset": 2048, 00:16:49.893 "data_size": 63488 00:16:49.893 } 00:16:49.893 ] 00:16:49.893 }' 00:16:49.893 10:43:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.893 10:43:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.459 10:43:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.459 10:43:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:50.716 10:43:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:50.716 10:43:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.716 10:43:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:50.974 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 200d124c-1371-4993-9942-0f63a9df68a1 00:16:51.231 [2024-07-12 10:43:26.293830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:51.231 [2024-07-12 10:43:26.293989] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaff1b0 00:16:51.231 [2024-07-12 10:43:26.294003] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:51.231 [2024-07-12 10:43:26.294182] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcbb4f0 00:16:51.231 [2024-07-12 10:43:26.294303] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaff1b0 00:16:51.231 [2024-07-12 10:43:26.294313] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xaff1b0 00:16:51.231 [2024-07-12 10:43:26.294407] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:51.231 NewBaseBdev 00:16:51.231 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:51.231 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:51.231 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:51.231 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:51.231 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:51.231 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:51.231 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:51.489 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:51.747 [ 00:16:51.747 { 00:16:51.747 "name": "NewBaseBdev", 00:16:51.747 "aliases": [ 00:16:51.747 "200d124c-1371-4993-9942-0f63a9df68a1" 00:16:51.747 ], 00:16:51.747 "product_name": "Malloc disk", 00:16:51.747 "block_size": 512, 00:16:51.747 "num_blocks": 65536, 00:16:51.747 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:51.747 "assigned_rate_limits": { 00:16:51.747 "rw_ios_per_sec": 0, 00:16:51.747 "rw_mbytes_per_sec": 0, 00:16:51.747 "r_mbytes_per_sec": 0, 00:16:51.747 "w_mbytes_per_sec": 0 00:16:51.747 }, 00:16:51.747 "claimed": true, 00:16:51.747 "claim_type": "exclusive_write", 00:16:51.748 "zoned": false, 00:16:51.748 "supported_io_types": { 00:16:51.748 "read": true, 00:16:51.748 "write": true, 00:16:51.748 "unmap": true, 00:16:51.748 "flush": true, 00:16:51.748 "reset": true, 00:16:51.748 "nvme_admin": false, 00:16:51.748 "nvme_io": false, 00:16:51.748 "nvme_io_md": false, 00:16:51.748 "write_zeroes": true, 00:16:51.748 "zcopy": true, 00:16:51.748 "get_zone_info": false, 00:16:51.748 "zone_management": false, 00:16:51.748 "zone_append": false, 00:16:51.748 "compare": false, 00:16:51.748 "compare_and_write": false, 00:16:51.748 "abort": true, 00:16:51.748 "seek_hole": false, 00:16:51.748 "seek_data": false, 00:16:51.748 "copy": true, 00:16:51.748 "nvme_iov_md": false 00:16:51.748 }, 00:16:51.748 "memory_domains": [ 00:16:51.748 { 00:16:51.748 "dma_device_id": "system", 00:16:51.748 "dma_device_type": 1 00:16:51.748 }, 00:16:51.748 { 00:16:51.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.748 "dma_device_type": 2 00:16:51.748 } 00:16:51.748 ], 00:16:51.748 "driver_specific": {} 00:16:51.748 } 00:16:51.748 ] 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.748 10:43:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.006 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.006 "name": "Existed_Raid", 00:16:52.006 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:52.006 "strip_size_kb": 0, 00:16:52.006 "state": "online", 00:16:52.006 "raid_level": "raid1", 00:16:52.006 "superblock": true, 00:16:52.006 "num_base_bdevs": 3, 00:16:52.006 "num_base_bdevs_discovered": 3, 00:16:52.006 "num_base_bdevs_operational": 3, 00:16:52.006 "base_bdevs_list": [ 00:16:52.006 { 00:16:52.006 "name": "NewBaseBdev", 00:16:52.006 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:52.006 "is_configured": true, 00:16:52.006 "data_offset": 2048, 00:16:52.006 "data_size": 63488 00:16:52.006 }, 00:16:52.006 { 00:16:52.006 "name": "BaseBdev2", 00:16:52.006 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:52.006 "is_configured": true, 00:16:52.006 "data_offset": 2048, 00:16:52.006 "data_size": 63488 00:16:52.006 }, 00:16:52.006 { 00:16:52.006 "name": "BaseBdev3", 00:16:52.006 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:52.006 "is_configured": true, 00:16:52.006 "data_offset": 2048, 00:16:52.006 "data_size": 63488 00:16:52.006 } 00:16:52.006 ] 00:16:52.006 }' 00:16:52.006 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.006 10:43:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:52.572 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:52.830 [2024-07-12 10:43:27.858281] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:52.830 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:52.830 "name": "Existed_Raid", 00:16:52.830 "aliases": [ 00:16:52.830 "505019be-c4d0-4632-b727-8dc7cc5eff53" 00:16:52.830 ], 00:16:52.830 "product_name": "Raid Volume", 00:16:52.830 "block_size": 512, 00:16:52.830 "num_blocks": 63488, 00:16:52.830 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:52.830 "assigned_rate_limits": { 00:16:52.830 "rw_ios_per_sec": 0, 00:16:52.830 "rw_mbytes_per_sec": 0, 00:16:52.830 "r_mbytes_per_sec": 0, 00:16:52.830 "w_mbytes_per_sec": 0 00:16:52.830 }, 00:16:52.830 "claimed": false, 00:16:52.830 "zoned": false, 00:16:52.830 "supported_io_types": { 00:16:52.830 "read": true, 00:16:52.830 "write": true, 00:16:52.830 "unmap": false, 00:16:52.830 "flush": false, 00:16:52.830 "reset": true, 00:16:52.830 "nvme_admin": false, 00:16:52.830 "nvme_io": false, 00:16:52.830 "nvme_io_md": 
false, 00:16:52.830 "write_zeroes": true, 00:16:52.830 "zcopy": false, 00:16:52.830 "get_zone_info": false, 00:16:52.830 "zone_management": false, 00:16:52.830 "zone_append": false, 00:16:52.830 "compare": false, 00:16:52.830 "compare_and_write": false, 00:16:52.830 "abort": false, 00:16:52.830 "seek_hole": false, 00:16:52.830 "seek_data": false, 00:16:52.830 "copy": false, 00:16:52.830 "nvme_iov_md": false 00:16:52.830 }, 00:16:52.830 "memory_domains": [ 00:16:52.830 { 00:16:52.830 "dma_device_id": "system", 00:16:52.830 "dma_device_type": 1 00:16:52.830 }, 00:16:52.830 { 00:16:52.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.830 "dma_device_type": 2 00:16:52.830 }, 00:16:52.830 { 00:16:52.830 "dma_device_id": "system", 00:16:52.830 "dma_device_type": 1 00:16:52.830 }, 00:16:52.830 { 00:16:52.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.830 "dma_device_type": 2 00:16:52.830 }, 00:16:52.830 { 00:16:52.830 "dma_device_id": "system", 00:16:52.830 "dma_device_type": 1 00:16:52.830 }, 00:16:52.830 { 00:16:52.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.830 "dma_device_type": 2 00:16:52.830 } 00:16:52.830 ], 00:16:52.830 "driver_specific": { 00:16:52.830 "raid": { 00:16:52.830 "uuid": "505019be-c4d0-4632-b727-8dc7cc5eff53", 00:16:52.830 "strip_size_kb": 0, 00:16:52.830 "state": "online", 00:16:52.830 "raid_level": "raid1", 00:16:52.830 "superblock": true, 00:16:52.830 "num_base_bdevs": 3, 00:16:52.830 "num_base_bdevs_discovered": 3, 00:16:52.830 "num_base_bdevs_operational": 3, 00:16:52.830 "base_bdevs_list": [ 00:16:52.830 { 00:16:52.830 "name": "NewBaseBdev", 00:16:52.830 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:52.830 "is_configured": true, 00:16:52.830 "data_offset": 2048, 00:16:52.830 "data_size": 63488 00:16:52.830 }, 00:16:52.830 { 00:16:52.830 "name": "BaseBdev2", 00:16:52.830 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:52.830 "is_configured": true, 00:16:52.830 "data_offset": 2048, 00:16:52.830 "data_size": 63488 00:16:52.830 }, 00:16:52.830 { 00:16:52.830 "name": "BaseBdev3", 00:16:52.830 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:52.830 "is_configured": true, 00:16:52.830 "data_offset": 2048, 00:16:52.830 "data_size": 63488 00:16:52.830 } 00:16:52.830 ] 00:16:52.830 } 00:16:52.830 } 00:16:52.830 }' 00:16:52.830 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:52.830 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:52.830 BaseBdev2 00:16:52.830 BaseBdev3' 00:16:52.830 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.830 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:52.830 10:43:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.088 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.088 "name": "NewBaseBdev", 00:16:53.088 "aliases": [ 00:16:53.088 "200d124c-1371-4993-9942-0f63a9df68a1" 00:16:53.088 ], 00:16:53.088 "product_name": "Malloc disk", 00:16:53.088 "block_size": 512, 00:16:53.088 "num_blocks": 65536, 00:16:53.088 "uuid": "200d124c-1371-4993-9942-0f63a9df68a1", 00:16:53.088 "assigned_rate_limits": { 00:16:53.088 
"rw_ios_per_sec": 0, 00:16:53.088 "rw_mbytes_per_sec": 0, 00:16:53.088 "r_mbytes_per_sec": 0, 00:16:53.088 "w_mbytes_per_sec": 0 00:16:53.088 }, 00:16:53.088 "claimed": true, 00:16:53.088 "claim_type": "exclusive_write", 00:16:53.088 "zoned": false, 00:16:53.088 "supported_io_types": { 00:16:53.088 "read": true, 00:16:53.088 "write": true, 00:16:53.088 "unmap": true, 00:16:53.088 "flush": true, 00:16:53.088 "reset": true, 00:16:53.088 "nvme_admin": false, 00:16:53.088 "nvme_io": false, 00:16:53.088 "nvme_io_md": false, 00:16:53.088 "write_zeroes": true, 00:16:53.088 "zcopy": true, 00:16:53.088 "get_zone_info": false, 00:16:53.088 "zone_management": false, 00:16:53.088 "zone_append": false, 00:16:53.088 "compare": false, 00:16:53.088 "compare_and_write": false, 00:16:53.088 "abort": true, 00:16:53.088 "seek_hole": false, 00:16:53.088 "seek_data": false, 00:16:53.088 "copy": true, 00:16:53.088 "nvme_iov_md": false 00:16:53.088 }, 00:16:53.088 "memory_domains": [ 00:16:53.088 { 00:16:53.088 "dma_device_id": "system", 00:16:53.088 "dma_device_type": 1 00:16:53.088 }, 00:16:53.088 { 00:16:53.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.088 "dma_device_type": 2 00:16:53.088 } 00:16:53.088 ], 00:16:53.088 "driver_specific": {} 00:16:53.088 }' 00:16:53.088 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.088 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.088 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.088 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.346 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.346 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.346 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.346 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.346 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.346 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.347 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.347 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.347 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.347 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:53.347 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.605 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.605 "name": "BaseBdev2", 00:16:53.605 "aliases": [ 00:16:53.605 "a02fca80-d07e-4ff1-b6d2-25f70f651eb9" 00:16:53.605 ], 00:16:53.605 "product_name": "Malloc disk", 00:16:53.605 "block_size": 512, 00:16:53.605 "num_blocks": 65536, 00:16:53.605 "uuid": "a02fca80-d07e-4ff1-b6d2-25f70f651eb9", 00:16:53.605 "assigned_rate_limits": { 00:16:53.605 "rw_ios_per_sec": 0, 00:16:53.605 "rw_mbytes_per_sec": 0, 00:16:53.605 "r_mbytes_per_sec": 0, 00:16:53.605 "w_mbytes_per_sec": 0 
00:16:53.605 }, 00:16:53.605 "claimed": true, 00:16:53.605 "claim_type": "exclusive_write", 00:16:53.605 "zoned": false, 00:16:53.605 "supported_io_types": { 00:16:53.605 "read": true, 00:16:53.605 "write": true, 00:16:53.605 "unmap": true, 00:16:53.605 "flush": true, 00:16:53.605 "reset": true, 00:16:53.605 "nvme_admin": false, 00:16:53.605 "nvme_io": false, 00:16:53.605 "nvme_io_md": false, 00:16:53.605 "write_zeroes": true, 00:16:53.605 "zcopy": true, 00:16:53.605 "get_zone_info": false, 00:16:53.605 "zone_management": false, 00:16:53.605 "zone_append": false, 00:16:53.605 "compare": false, 00:16:53.605 "compare_and_write": false, 00:16:53.605 "abort": true, 00:16:53.605 "seek_hole": false, 00:16:53.605 "seek_data": false, 00:16:53.605 "copy": true, 00:16:53.605 "nvme_iov_md": false 00:16:53.605 }, 00:16:53.605 "memory_domains": [ 00:16:53.605 { 00:16:53.605 "dma_device_id": "system", 00:16:53.605 "dma_device_type": 1 00:16:53.605 }, 00:16:53.605 { 00:16:53.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.605 "dma_device_type": 2 00:16:53.605 } 00:16:53.605 ], 00:16:53.605 "driver_specific": {} 00:16:53.605 }' 00:16:53.605 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.862 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.862 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.862 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.862 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.862 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.862 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.862 10:43:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.862 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.862 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.120 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.120 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.120 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.120 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:54.120 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.379 "name": "BaseBdev3", 00:16:54.379 "aliases": [ 00:16:54.379 "f2fe0411-9c75-464e-b520-0ac25272e1d2" 00:16:54.379 ], 00:16:54.379 "product_name": "Malloc disk", 00:16:54.379 "block_size": 512, 00:16:54.379 "num_blocks": 65536, 00:16:54.379 "uuid": "f2fe0411-9c75-464e-b520-0ac25272e1d2", 00:16:54.379 "assigned_rate_limits": { 00:16:54.379 "rw_ios_per_sec": 0, 00:16:54.379 "rw_mbytes_per_sec": 0, 00:16:54.379 "r_mbytes_per_sec": 0, 00:16:54.379 "w_mbytes_per_sec": 0 00:16:54.379 }, 00:16:54.379 "claimed": true, 00:16:54.379 "claim_type": "exclusive_write", 00:16:54.379 "zoned": false, 00:16:54.379 
"supported_io_types": { 00:16:54.379 "read": true, 00:16:54.379 "write": true, 00:16:54.379 "unmap": true, 00:16:54.379 "flush": true, 00:16:54.379 "reset": true, 00:16:54.379 "nvme_admin": false, 00:16:54.379 "nvme_io": false, 00:16:54.379 "nvme_io_md": false, 00:16:54.379 "write_zeroes": true, 00:16:54.379 "zcopy": true, 00:16:54.379 "get_zone_info": false, 00:16:54.379 "zone_management": false, 00:16:54.379 "zone_append": false, 00:16:54.379 "compare": false, 00:16:54.379 "compare_and_write": false, 00:16:54.379 "abort": true, 00:16:54.379 "seek_hole": false, 00:16:54.379 "seek_data": false, 00:16:54.379 "copy": true, 00:16:54.379 "nvme_iov_md": false 00:16:54.379 }, 00:16:54.379 "memory_domains": [ 00:16:54.379 { 00:16:54.379 "dma_device_id": "system", 00:16:54.379 "dma_device_type": 1 00:16:54.379 }, 00:16:54.379 { 00:16:54.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.379 "dma_device_type": 2 00:16:54.379 } 00:16:54.379 ], 00:16:54.379 "driver_specific": {} 00:16:54.379 }' 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.379 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.637 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.637 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.637 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.637 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.637 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.637 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:54.895 [2024-07-12 10:43:29.931530] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:54.895 [2024-07-12 10:43:29.931556] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:54.895 [2024-07-12 10:43:29.931611] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:54.895 [2024-07-12 10:43:29.931884] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:54.895 [2024-07-12 10:43:29.931896] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaff1b0 name Existed_Raid, state offline 00:16:54.895 10:43:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2066298 00:16:54.895 10:43:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2066298 ']' 00:16:54.895 10:43:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2066298 00:16:54.895 10:43:29 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:54.895 10:43:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:54.895 10:43:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2066298 00:16:54.895 10:43:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:54.895 10:43:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:54.895 10:43:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2066298' 00:16:54.895 killing process with pid 2066298 00:16:54.895 10:43:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2066298 00:16:54.895 [2024-07-12 10:43:30.002828] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:54.895 10:43:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2066298 00:16:54.895 [2024-07-12 10:43:30.030820] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:55.153 10:43:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:55.153 00:16:55.153 real 0m28.065s 00:16:55.153 user 0m51.411s 00:16:55.153 sys 0m5.150s 00:16:55.153 10:43:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:55.153 10:43:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.153 ************************************ 00:16:55.153 END TEST raid_state_function_test_sb 00:16:55.153 ************************************ 00:16:55.153 10:43:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:55.153 10:43:30 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:16:55.153 10:43:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:55.153 10:43:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:55.153 10:43:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:55.153 ************************************ 00:16:55.153 START TEST raid_superblock_test 00:16:55.153 ************************************ 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:55.153 10:43:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2070574 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2070574 /var/tmp/spdk-raid.sock 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2070574 ']' 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:55.153 10:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:55.411 10:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:55.411 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:55.411 10:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:55.411 10:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.411 [2024-07-12 10:43:30.405055] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
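The superblock test drives its own bdev_svc instance over a dedicated RPC socket, so nothing can be configured until that socket answers. A rough sketch of this start-up sequence, assuming rpc_get_methods (part of SPDK's generic RPC set) as the liveness probe; the polling loop is a stand-in for the harness's waitforlisten helper rather than a copy of it:

    # Sketch only: start the per-test app and block until its RPC socket is live.
    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
    sock=/var/tmp/spdk-raid.sock

    $spdk/test/app/bdev_svc/bdev_svc -r $sock -L bdev_raid &
    raid_pid=$!

    until $spdk/scripts/rpc.py -s $sock rpc_get_methods >/dev/null 2>&1; do
        kill -0 $raid_pid || exit 1   # bail out if bdev_svc died during start-up
        sleep 0.5
    done

Once the socket responds, the malloc/passthru base bdevs and the raid1 volume are created with the bdev_malloc_create, bdev_passthru_create and bdev_raid_create RPCs that follow in the trace.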
00:16:55.411 [2024-07-12 10:43:30.405126] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2070574 ] 00:16:55.411 [2024-07-12 10:43:30.535593] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:55.695 [2024-07-12 10:43:30.639008] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:55.695 [2024-07-12 10:43:30.696060] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:55.695 [2024-07-12 10:43:30.696087] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:56.266 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:56.523 malloc1 00:16:56.523 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:56.779 [2024-07-12 10:43:31.749170] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:56.779 [2024-07-12 10:43:31.749221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:56.779 [2024-07-12 10:43:31.749243] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc8d570 00:16:56.779 [2024-07-12 10:43:31.749256] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:56.779 [2024-07-12 10:43:31.750928] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:56.779 [2024-07-12 10:43:31.750958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:56.779 pt1 00:16:56.779 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:56.779 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:56.779 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:56.779 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:56.779 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:56.779 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:56.779 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:56.780 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:56.780 10:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:57.036 malloc2 00:16:57.036 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:57.292 [2024-07-12 10:43:32.243339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:57.292 [2024-07-12 10:43:32.243391] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.292 [2024-07-12 10:43:32.243408] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc8e970 00:16:57.292 [2024-07-12 10:43:32.243420] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.292 [2024-07-12 10:43:32.244944] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.292 [2024-07-12 10:43:32.244978] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:57.292 pt2 00:16:57.292 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:57.292 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:57.292 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:57.292 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:57.292 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:57.293 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:57.293 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:57.293 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:57.293 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:57.550 malloc3 00:16:57.550 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:57.550 [2024-07-12 10:43:32.737480] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:57.550 [2024-07-12 10:43:32.737536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.550 [2024-07-12 10:43:32.737555] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe25340 00:16:57.550 [2024-07-12 10:43:32.737569] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.550 [2024-07-12 10:43:32.739002] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.550 [2024-07-12 10:43:32.739033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:57.550 pt3 00:16:57.807 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:57.807 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:57.808 10:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:57.808 [2024-07-12 10:43:32.982143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:57.808 [2024-07-12 10:43:32.983441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:57.808 [2024-07-12 10:43:32.983503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:57.808 [2024-07-12 10:43:32.983656] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc85ea0 00:16:57.808 [2024-07-12 10:43:32.983667] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:57.808 [2024-07-12 10:43:32.983869] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc8d240 00:16:57.808 [2024-07-12 10:43:32.984016] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc85ea0 00:16:57.808 [2024-07-12 10:43:32.984027] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc85ea0 00:16:57.808 [2024-07-12 10:43:32.984124] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.065 "name": "raid_bdev1", 00:16:58.065 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:16:58.065 "strip_size_kb": 0, 00:16:58.065 "state": "online", 00:16:58.065 "raid_level": "raid1", 00:16:58.065 "superblock": true, 00:16:58.065 "num_base_bdevs": 3, 00:16:58.065 
"num_base_bdevs_discovered": 3, 00:16:58.065 "num_base_bdevs_operational": 3, 00:16:58.065 "base_bdevs_list": [ 00:16:58.065 { 00:16:58.065 "name": "pt1", 00:16:58.065 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.065 "is_configured": true, 00:16:58.065 "data_offset": 2048, 00:16:58.065 "data_size": 63488 00:16:58.065 }, 00:16:58.065 { 00:16:58.065 "name": "pt2", 00:16:58.065 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.065 "is_configured": true, 00:16:58.065 "data_offset": 2048, 00:16:58.065 "data_size": 63488 00:16:58.065 }, 00:16:58.065 { 00:16:58.065 "name": "pt3", 00:16:58.065 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.065 "is_configured": true, 00:16:58.065 "data_offset": 2048, 00:16:58.065 "data_size": 63488 00:16:58.065 } 00:16:58.065 ] 00:16:58.065 }' 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.065 10:43:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:58.999 10:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:58.999 [2024-07-12 10:43:34.085308] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:58.999 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:58.999 "name": "raid_bdev1", 00:16:58.999 "aliases": [ 00:16:58.999 "ac8013c0-a705-41f5-9ede-eae626c074b9" 00:16:58.999 ], 00:16:58.999 "product_name": "Raid Volume", 00:16:58.999 "block_size": 512, 00:16:58.999 "num_blocks": 63488, 00:16:58.999 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:16:58.999 "assigned_rate_limits": { 00:16:58.999 "rw_ios_per_sec": 0, 00:16:58.999 "rw_mbytes_per_sec": 0, 00:16:58.999 "r_mbytes_per_sec": 0, 00:16:58.999 "w_mbytes_per_sec": 0 00:16:58.999 }, 00:16:58.999 "claimed": false, 00:16:58.999 "zoned": false, 00:16:58.999 "supported_io_types": { 00:16:58.999 "read": true, 00:16:58.999 "write": true, 00:16:58.999 "unmap": false, 00:16:58.999 "flush": false, 00:16:58.999 "reset": true, 00:16:58.999 "nvme_admin": false, 00:16:58.999 "nvme_io": false, 00:16:58.999 "nvme_io_md": false, 00:16:58.999 "write_zeroes": true, 00:16:58.999 "zcopy": false, 00:16:58.999 "get_zone_info": false, 00:16:58.999 "zone_management": false, 00:16:58.999 "zone_append": false, 00:16:58.999 "compare": false, 00:16:58.999 "compare_and_write": false, 00:16:58.999 "abort": false, 00:16:58.999 "seek_hole": false, 00:16:58.999 "seek_data": false, 00:16:58.999 "copy": false, 00:16:58.999 "nvme_iov_md": false 00:16:58.999 }, 00:16:58.999 "memory_domains": [ 00:16:58.999 { 00:16:58.999 "dma_device_id": "system", 00:16:58.999 "dma_device_type": 1 00:16:58.999 }, 
00:16:58.999 { 00:16:58.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.999 "dma_device_type": 2 00:16:58.999 }, 00:16:58.999 { 00:16:58.999 "dma_device_id": "system", 00:16:58.999 "dma_device_type": 1 00:16:58.999 }, 00:16:58.999 { 00:16:58.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.999 "dma_device_type": 2 00:16:58.999 }, 00:16:58.999 { 00:16:58.999 "dma_device_id": "system", 00:16:58.999 "dma_device_type": 1 00:16:58.999 }, 00:16:58.999 { 00:16:58.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.999 "dma_device_type": 2 00:16:58.999 } 00:16:58.999 ], 00:16:58.999 "driver_specific": { 00:16:58.999 "raid": { 00:16:58.999 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:16:58.999 "strip_size_kb": 0, 00:16:58.999 "state": "online", 00:16:58.999 "raid_level": "raid1", 00:16:58.999 "superblock": true, 00:16:58.999 "num_base_bdevs": 3, 00:16:58.999 "num_base_bdevs_discovered": 3, 00:16:58.999 "num_base_bdevs_operational": 3, 00:16:58.999 "base_bdevs_list": [ 00:16:58.999 { 00:16:58.999 "name": "pt1", 00:16:58.999 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.999 "is_configured": true, 00:16:58.999 "data_offset": 2048, 00:16:58.999 "data_size": 63488 00:16:58.999 }, 00:16:58.999 { 00:16:58.999 "name": "pt2", 00:16:58.999 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.999 "is_configured": true, 00:16:58.999 "data_offset": 2048, 00:16:58.999 "data_size": 63488 00:16:58.999 }, 00:16:58.999 { 00:16:58.999 "name": "pt3", 00:16:58.999 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.999 "is_configured": true, 00:16:58.999 "data_offset": 2048, 00:16:58.999 "data_size": 63488 00:16:58.999 } 00:16:58.999 ] 00:16:58.999 } 00:16:58.999 } 00:16:58.999 }' 00:16:58.999 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:58.999 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:58.999 pt2 00:16:58.999 pt3' 00:16:58.999 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:58.999 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:58.999 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:59.257 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:59.257 "name": "pt1", 00:16:59.257 "aliases": [ 00:16:59.257 "00000000-0000-0000-0000-000000000001" 00:16:59.257 ], 00:16:59.257 "product_name": "passthru", 00:16:59.257 "block_size": 512, 00:16:59.257 "num_blocks": 65536, 00:16:59.257 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.257 "assigned_rate_limits": { 00:16:59.257 "rw_ios_per_sec": 0, 00:16:59.257 "rw_mbytes_per_sec": 0, 00:16:59.257 "r_mbytes_per_sec": 0, 00:16:59.257 "w_mbytes_per_sec": 0 00:16:59.257 }, 00:16:59.257 "claimed": true, 00:16:59.257 "claim_type": "exclusive_write", 00:16:59.257 "zoned": false, 00:16:59.257 "supported_io_types": { 00:16:59.257 "read": true, 00:16:59.257 "write": true, 00:16:59.257 "unmap": true, 00:16:59.257 "flush": true, 00:16:59.257 "reset": true, 00:16:59.257 "nvme_admin": false, 00:16:59.257 "nvme_io": false, 00:16:59.257 "nvme_io_md": false, 00:16:59.257 "write_zeroes": true, 00:16:59.257 "zcopy": true, 00:16:59.257 "get_zone_info": false, 00:16:59.257 "zone_management": false, 00:16:59.257 
"zone_append": false, 00:16:59.257 "compare": false, 00:16:59.257 "compare_and_write": false, 00:16:59.257 "abort": true, 00:16:59.257 "seek_hole": false, 00:16:59.257 "seek_data": false, 00:16:59.257 "copy": true, 00:16:59.257 "nvme_iov_md": false 00:16:59.257 }, 00:16:59.257 "memory_domains": [ 00:16:59.257 { 00:16:59.257 "dma_device_id": "system", 00:16:59.257 "dma_device_type": 1 00:16:59.257 }, 00:16:59.257 { 00:16:59.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.257 "dma_device_type": 2 00:16:59.257 } 00:16:59.257 ], 00:16:59.257 "driver_specific": { 00:16:59.257 "passthru": { 00:16:59.257 "name": "pt1", 00:16:59.257 "base_bdev_name": "malloc1" 00:16:59.257 } 00:16:59.257 } 00:16:59.257 }' 00:16:59.257 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.257 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.514 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.772 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.772 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:59.772 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:59.772 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.030 10:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.030 "name": "pt2", 00:17:00.030 "aliases": [ 00:17:00.030 "00000000-0000-0000-0000-000000000002" 00:17:00.030 ], 00:17:00.030 "product_name": "passthru", 00:17:00.030 "block_size": 512, 00:17:00.030 "num_blocks": 65536, 00:17:00.030 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.030 "assigned_rate_limits": { 00:17:00.030 "rw_ios_per_sec": 0, 00:17:00.030 "rw_mbytes_per_sec": 0, 00:17:00.030 "r_mbytes_per_sec": 0, 00:17:00.030 "w_mbytes_per_sec": 0 00:17:00.030 }, 00:17:00.030 "claimed": true, 00:17:00.030 "claim_type": "exclusive_write", 00:17:00.030 "zoned": false, 00:17:00.030 "supported_io_types": { 00:17:00.030 "read": true, 00:17:00.030 "write": true, 00:17:00.030 "unmap": true, 00:17:00.030 "flush": true, 00:17:00.030 "reset": true, 00:17:00.030 "nvme_admin": false, 00:17:00.030 "nvme_io": false, 00:17:00.030 "nvme_io_md": false, 00:17:00.030 "write_zeroes": true, 00:17:00.030 "zcopy": true, 00:17:00.030 "get_zone_info": false, 00:17:00.030 "zone_management": false, 00:17:00.030 "zone_append": false, 00:17:00.030 "compare": false, 00:17:00.030 "compare_and_write": false, 00:17:00.030 "abort": true, 00:17:00.030 
"seek_hole": false, 00:17:00.030 "seek_data": false, 00:17:00.030 "copy": true, 00:17:00.030 "nvme_iov_md": false 00:17:00.030 }, 00:17:00.030 "memory_domains": [ 00:17:00.030 { 00:17:00.030 "dma_device_id": "system", 00:17:00.030 "dma_device_type": 1 00:17:00.030 }, 00:17:00.030 { 00:17:00.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.030 "dma_device_type": 2 00:17:00.030 } 00:17:00.030 ], 00:17:00.030 "driver_specific": { 00:17:00.030 "passthru": { 00:17:00.030 "name": "pt2", 00:17:00.030 "base_bdev_name": "malloc2" 00:17:00.030 } 00:17:00.030 } 00:17:00.030 }' 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.030 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.287 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.287 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.287 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.287 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.287 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.287 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:00.287 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.545 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.545 "name": "pt3", 00:17:00.545 "aliases": [ 00:17:00.545 "00000000-0000-0000-0000-000000000003" 00:17:00.545 ], 00:17:00.545 "product_name": "passthru", 00:17:00.545 "block_size": 512, 00:17:00.545 "num_blocks": 65536, 00:17:00.545 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.545 "assigned_rate_limits": { 00:17:00.545 "rw_ios_per_sec": 0, 00:17:00.545 "rw_mbytes_per_sec": 0, 00:17:00.545 "r_mbytes_per_sec": 0, 00:17:00.545 "w_mbytes_per_sec": 0 00:17:00.545 }, 00:17:00.545 "claimed": true, 00:17:00.545 "claim_type": "exclusive_write", 00:17:00.545 "zoned": false, 00:17:00.545 "supported_io_types": { 00:17:00.545 "read": true, 00:17:00.545 "write": true, 00:17:00.545 "unmap": true, 00:17:00.545 "flush": true, 00:17:00.545 "reset": true, 00:17:00.545 "nvme_admin": false, 00:17:00.545 "nvme_io": false, 00:17:00.545 "nvme_io_md": false, 00:17:00.545 "write_zeroes": true, 00:17:00.545 "zcopy": true, 00:17:00.545 "get_zone_info": false, 00:17:00.545 "zone_management": false, 00:17:00.545 "zone_append": false, 00:17:00.545 "compare": false, 00:17:00.545 "compare_and_write": false, 00:17:00.545 "abort": true, 00:17:00.545 "seek_hole": false, 00:17:00.545 "seek_data": false, 00:17:00.545 "copy": true, 00:17:00.545 "nvme_iov_md": false 00:17:00.545 }, 
00:17:00.545 "memory_domains": [ 00:17:00.545 { 00:17:00.545 "dma_device_id": "system", 00:17:00.545 "dma_device_type": 1 00:17:00.545 }, 00:17:00.545 { 00:17:00.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.545 "dma_device_type": 2 00:17:00.545 } 00:17:00.545 ], 00:17:00.545 "driver_specific": { 00:17:00.545 "passthru": { 00:17:00.545 "name": "pt3", 00:17:00.545 "base_bdev_name": "malloc3" 00:17:00.545 } 00:17:00.545 } 00:17:00.545 }' 00:17:00.545 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.545 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.545 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:00.545 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.545 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:00.802 10:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:01.058 [2024-07-12 10:43:36.162807] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.058 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ac8013c0-a705-41f5-9ede-eae626c074b9 00:17:01.058 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ac8013c0-a705-41f5-9ede-eae626c074b9 ']' 00:17:01.058 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:01.315 [2024-07-12 10:43:36.407188] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:01.315 [2024-07-12 10:43:36.407213] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:01.315 [2024-07-12 10:43:36.407266] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:01.315 [2024-07-12 10:43:36.407335] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:01.315 [2024-07-12 10:43:36.407347] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc85ea0 name raid_bdev1, state offline 00:17:01.315 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.315 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:01.572 10:43:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:01.572 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:01.572 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:01.572 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:01.829 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:01.829 10:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:02.086 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.086 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:02.344 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:02.602 
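The NOT wrapper at bdev_raid.sh@456 above is the negative half of the superblock test: raid_bdev1 has just been deleted and its passthru base bdevs torn down, but malloc1/malloc2/malloc3 still carry the superblock it wrote, so re-creating a volume over them has to be rejected; the -17 "File exists" JSON-RPC response follows below. A minimal standalone sketch of the same check, with the rpc.py path and socket copied from this trace and the rpc()/check_create_rejected names purely illustrative:

    # Sketch only -- not the bdev_raid.sh code itself.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    check_create_rejected() {
        # creating raid_bdev1 over base bdevs holding a stale superblock must fail
        if rpc bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1; then
            echo "bdev_raid_create unexpectedly succeeded" >&2
            return 1
        fi
        # and the failed create must not leave any raid bdev behind
        [ -z "$(rpc bdev_raid_get_bdevs all | jq -r '.[]')" ]
    }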
[2024-07-12 10:43:37.678500] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:02.602 [2024-07-12 10:43:37.679828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:02.602 [2024-07-12 10:43:37.679871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:02.602 [2024-07-12 10:43:37.679918] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:02.602 [2024-07-12 10:43:37.679958] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:02.602 [2024-07-12 10:43:37.679980] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:02.602 [2024-07-12 10:43:37.679998] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:02.602 [2024-07-12 10:43:37.680009] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe30ff0 name raid_bdev1, state configuring 00:17:02.602 request: 00:17:02.602 { 00:17:02.602 "name": "raid_bdev1", 00:17:02.602 "raid_level": "raid1", 00:17:02.602 "base_bdevs": [ 00:17:02.602 "malloc1", 00:17:02.602 "malloc2", 00:17:02.602 "malloc3" 00:17:02.602 ], 00:17:02.602 "superblock": false, 00:17:02.602 "method": "bdev_raid_create", 00:17:02.602 "req_id": 1 00:17:02.602 } 00:17:02.602 Got JSON-RPC error response 00:17:02.602 response: 00:17:02.602 { 00:17:02.602 "code": -17, 00:17:02.602 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:02.602 } 00:17:02.602 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:02.602 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:02.602 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:02.602 10:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:02.602 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.602 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:02.860 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:02.860 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:02.860 10:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:03.117 [2024-07-12 10:43:38.103564] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:03.117 [2024-07-12 10:43:38.103603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:03.117 [2024-07-12 10:43:38.103625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc8d7a0 00:17:03.117 [2024-07-12 10:43:38.103637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:03.117 [2024-07-12 10:43:38.105167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:03.117 [2024-07-12 10:43:38.105196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:03.117 [2024-07-12 
10:43:38.105260] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:03.117 [2024-07-12 10:43:38.105286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:03.117 pt1 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.117 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:03.375 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.375 "name": "raid_bdev1", 00:17:03.375 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:03.375 "strip_size_kb": 0, 00:17:03.375 "state": "configuring", 00:17:03.375 "raid_level": "raid1", 00:17:03.375 "superblock": true, 00:17:03.375 "num_base_bdevs": 3, 00:17:03.375 "num_base_bdevs_discovered": 1, 00:17:03.375 "num_base_bdevs_operational": 3, 00:17:03.375 "base_bdevs_list": [ 00:17:03.375 { 00:17:03.375 "name": "pt1", 00:17:03.375 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:03.375 "is_configured": true, 00:17:03.375 "data_offset": 2048, 00:17:03.375 "data_size": 63488 00:17:03.375 }, 00:17:03.375 { 00:17:03.375 "name": null, 00:17:03.375 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:03.375 "is_configured": false, 00:17:03.375 "data_offset": 2048, 00:17:03.375 "data_size": 63488 00:17:03.375 }, 00:17:03.375 { 00:17:03.375 "name": null, 00:17:03.375 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.375 "is_configured": false, 00:17:03.375 "data_offset": 2048, 00:17:03.375 "data_size": 63488 00:17:03.375 } 00:17:03.375 ] 00:17:03.375 }' 00:17:03.375 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.375 10:43:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.941 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:03.941 10:43:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:03.941 [2024-07-12 10:43:39.114248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:03.941 [2024-07-12 
10:43:39.114300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:03.941 [2024-07-12 10:43:39.114320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc84a10 00:17:03.941 [2024-07-12 10:43:39.114339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:03.941 [2024-07-12 10:43:39.114695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:03.941 [2024-07-12 10:43:39.114715] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:03.941 [2024-07-12 10:43:39.114779] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:03.941 [2024-07-12 10:43:39.114798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:03.941 pt2 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:04.199 [2024-07-12 10:43:39.290730] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.199 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:04.458 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.458 "name": "raid_bdev1", 00:17:04.458 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:04.458 "strip_size_kb": 0, 00:17:04.458 "state": "configuring", 00:17:04.458 "raid_level": "raid1", 00:17:04.458 "superblock": true, 00:17:04.458 "num_base_bdevs": 3, 00:17:04.458 "num_base_bdevs_discovered": 1, 00:17:04.458 "num_base_bdevs_operational": 3, 00:17:04.458 "base_bdevs_list": [ 00:17:04.458 { 00:17:04.458 "name": "pt1", 00:17:04.458 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:04.458 "is_configured": true, 00:17:04.458 "data_offset": 2048, 00:17:04.458 "data_size": 63488 00:17:04.458 }, 00:17:04.458 { 00:17:04.458 "name": null, 00:17:04.458 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:04.458 "is_configured": false, 00:17:04.458 "data_offset": 2048, 00:17:04.458 "data_size": 63488 00:17:04.458 }, 00:17:04.458 { 00:17:04.458 "name": null, 
00:17:04.458 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:04.458 "is_configured": false, 00:17:04.458 "data_offset": 2048, 00:17:04.458 "data_size": 63488 00:17:04.458 } 00:17:04.458 ] 00:17:04.458 }' 00:17:04.458 10:43:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.458 10:43:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.023 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:05.023 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:05.023 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:05.281 [2024-07-12 10:43:40.293387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:05.281 [2024-07-12 10:43:40.293439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.281 [2024-07-12 10:43:40.293462] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc8da10 00:17:05.281 [2024-07-12 10:43:40.293475] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.281 [2024-07-12 10:43:40.293823] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.281 [2024-07-12 10:43:40.293850] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:05.281 [2024-07-12 10:43:40.293915] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:05.281 [2024-07-12 10:43:40.293934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:05.281 pt2 00:17:05.281 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:05.281 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:05.281 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:05.281 [2024-07-12 10:43:40.469852] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:05.281 [2024-07-12 10:43:40.469888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.281 [2024-07-12 10:43:40.469903] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc846c0 00:17:05.281 [2024-07-12 10:43:40.469916] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.281 [2024-07-12 10:43:40.470203] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.281 [2024-07-12 10:43:40.470222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:05.281 [2024-07-12 10:43:40.470271] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:05.281 [2024-07-12 10:43:40.470288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:05.281 [2024-07-12 10:43:40.470388] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe27c00 00:17:05.281 [2024-07-12 10:43:40.470399] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:05.281 [2024-07-12 10:43:40.470571] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc87610 00:17:05.281 [2024-07-12 10:43:40.470714] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe27c00 00:17:05.281 [2024-07-12 10:43:40.470724] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe27c00 00:17:05.281 [2024-07-12 10:43:40.470821] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:05.281 pt3 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.540 "name": "raid_bdev1", 00:17:05.540 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:05.540 "strip_size_kb": 0, 00:17:05.540 "state": "online", 00:17:05.540 "raid_level": "raid1", 00:17:05.540 "superblock": true, 00:17:05.540 "num_base_bdevs": 3, 00:17:05.540 "num_base_bdevs_discovered": 3, 00:17:05.540 "num_base_bdevs_operational": 3, 00:17:05.540 "base_bdevs_list": [ 00:17:05.540 { 00:17:05.540 "name": "pt1", 00:17:05.540 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:05.540 "is_configured": true, 00:17:05.540 "data_offset": 2048, 00:17:05.540 "data_size": 63488 00:17:05.540 }, 00:17:05.540 { 00:17:05.540 "name": "pt2", 00:17:05.540 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:05.540 "is_configured": true, 00:17:05.540 "data_offset": 2048, 00:17:05.540 "data_size": 63488 00:17:05.540 }, 00:17:05.540 { 00:17:05.540 "name": "pt3", 00:17:05.540 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:05.540 "is_configured": true, 00:17:05.540 "data_offset": 2048, 00:17:05.540 "data_size": 63488 00:17:05.540 } 00:17:05.540 ] 00:17:05.540 }' 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.540 10:43:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- 
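The trace enters verify_raid_bdev_properties at bdev_raid.sh@483 next: it lists the configured base bdevs from the raid volume's driver_specific data and asserts block_size, md_size, md_interleave and dif_type on each of them, as the sh@200 to sh@208 lines below show. A rough standalone sketch of that loop (jq filters copied from the trace; the loop body itself is illustrative, not the harness code):

    # Sketch only; bdev_raid.sh@194-208 is the authoritative implementation.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    raid_info=$(rpc bdev_get_bdevs -b raid_bdev1 | jq '.[]')
    names=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<<"$raid_info")
    for name in $names; do                                    # pt1 pt2 pt3 while the array is complete
        bdev=$(rpc bdev_get_bdevs -b "$name" | jq '.[]')
        [ "$(jq .block_size    <<<"$bdev")" -eq 512 ]         # passthru on malloc: 512-byte blocks
        [ "$(jq .md_size       <<<"$bdev")" = null ]          # no separate metadata
        [ "$(jq .md_interleave <<<"$bdev")" = null ]
        [ "$(jq .dif_type      <<<"$bdev")" = null ]
    done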
bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:06.107 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:06.365 [2024-07-12 10:43:41.492840] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:06.365 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:06.365 "name": "raid_bdev1", 00:17:06.365 "aliases": [ 00:17:06.365 "ac8013c0-a705-41f5-9ede-eae626c074b9" 00:17:06.365 ], 00:17:06.365 "product_name": "Raid Volume", 00:17:06.365 "block_size": 512, 00:17:06.365 "num_blocks": 63488, 00:17:06.365 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:06.365 "assigned_rate_limits": { 00:17:06.365 "rw_ios_per_sec": 0, 00:17:06.365 "rw_mbytes_per_sec": 0, 00:17:06.365 "r_mbytes_per_sec": 0, 00:17:06.365 "w_mbytes_per_sec": 0 00:17:06.365 }, 00:17:06.365 "claimed": false, 00:17:06.365 "zoned": false, 00:17:06.365 "supported_io_types": { 00:17:06.365 "read": true, 00:17:06.365 "write": true, 00:17:06.365 "unmap": false, 00:17:06.365 "flush": false, 00:17:06.365 "reset": true, 00:17:06.365 "nvme_admin": false, 00:17:06.365 "nvme_io": false, 00:17:06.365 "nvme_io_md": false, 00:17:06.365 "write_zeroes": true, 00:17:06.365 "zcopy": false, 00:17:06.365 "get_zone_info": false, 00:17:06.365 "zone_management": false, 00:17:06.365 "zone_append": false, 00:17:06.365 "compare": false, 00:17:06.365 "compare_and_write": false, 00:17:06.365 "abort": false, 00:17:06.365 "seek_hole": false, 00:17:06.365 "seek_data": false, 00:17:06.365 "copy": false, 00:17:06.365 "nvme_iov_md": false 00:17:06.365 }, 00:17:06.365 "memory_domains": [ 00:17:06.365 { 00:17:06.365 "dma_device_id": "system", 00:17:06.365 "dma_device_type": 1 00:17:06.365 }, 00:17:06.365 { 00:17:06.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.365 "dma_device_type": 2 00:17:06.365 }, 00:17:06.365 { 00:17:06.365 "dma_device_id": "system", 00:17:06.365 "dma_device_type": 1 00:17:06.365 }, 00:17:06.365 { 00:17:06.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.365 "dma_device_type": 2 00:17:06.365 }, 00:17:06.365 { 00:17:06.365 "dma_device_id": "system", 00:17:06.365 "dma_device_type": 1 00:17:06.365 }, 00:17:06.365 { 00:17:06.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.365 "dma_device_type": 2 00:17:06.365 } 00:17:06.365 ], 00:17:06.365 "driver_specific": { 00:17:06.365 "raid": { 00:17:06.365 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:06.365 "strip_size_kb": 0, 00:17:06.365 "state": "online", 00:17:06.365 "raid_level": "raid1", 00:17:06.365 "superblock": true, 00:17:06.365 "num_base_bdevs": 3, 00:17:06.365 "num_base_bdevs_discovered": 3, 00:17:06.365 "num_base_bdevs_operational": 3, 00:17:06.365 "base_bdevs_list": [ 00:17:06.365 { 00:17:06.365 "name": "pt1", 00:17:06.365 
"uuid": "00000000-0000-0000-0000-000000000001", 00:17:06.365 "is_configured": true, 00:17:06.365 "data_offset": 2048, 00:17:06.365 "data_size": 63488 00:17:06.365 }, 00:17:06.365 { 00:17:06.365 "name": "pt2", 00:17:06.365 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.365 "is_configured": true, 00:17:06.365 "data_offset": 2048, 00:17:06.365 "data_size": 63488 00:17:06.365 }, 00:17:06.365 { 00:17:06.365 "name": "pt3", 00:17:06.365 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:06.365 "is_configured": true, 00:17:06.365 "data_offset": 2048, 00:17:06.365 "data_size": 63488 00:17:06.365 } 00:17:06.365 ] 00:17:06.365 } 00:17:06.365 } 00:17:06.365 }' 00:17:06.365 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:06.623 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:06.623 pt2 00:17:06.623 pt3' 00:17:06.623 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.623 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:06.623 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.623 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.623 "name": "pt1", 00:17:06.623 "aliases": [ 00:17:06.623 "00000000-0000-0000-0000-000000000001" 00:17:06.623 ], 00:17:06.623 "product_name": "passthru", 00:17:06.623 "block_size": 512, 00:17:06.623 "num_blocks": 65536, 00:17:06.623 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:06.623 "assigned_rate_limits": { 00:17:06.623 "rw_ios_per_sec": 0, 00:17:06.623 "rw_mbytes_per_sec": 0, 00:17:06.623 "r_mbytes_per_sec": 0, 00:17:06.623 "w_mbytes_per_sec": 0 00:17:06.623 }, 00:17:06.623 "claimed": true, 00:17:06.623 "claim_type": "exclusive_write", 00:17:06.623 "zoned": false, 00:17:06.623 "supported_io_types": { 00:17:06.623 "read": true, 00:17:06.623 "write": true, 00:17:06.623 "unmap": true, 00:17:06.623 "flush": true, 00:17:06.623 "reset": true, 00:17:06.623 "nvme_admin": false, 00:17:06.623 "nvme_io": false, 00:17:06.623 "nvme_io_md": false, 00:17:06.623 "write_zeroes": true, 00:17:06.623 "zcopy": true, 00:17:06.623 "get_zone_info": false, 00:17:06.623 "zone_management": false, 00:17:06.623 "zone_append": false, 00:17:06.623 "compare": false, 00:17:06.623 "compare_and_write": false, 00:17:06.623 "abort": true, 00:17:06.623 "seek_hole": false, 00:17:06.623 "seek_data": false, 00:17:06.623 "copy": true, 00:17:06.623 "nvme_iov_md": false 00:17:06.624 }, 00:17:06.624 "memory_domains": [ 00:17:06.624 { 00:17:06.624 "dma_device_id": "system", 00:17:06.624 "dma_device_type": 1 00:17:06.624 }, 00:17:06.624 { 00:17:06.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.624 "dma_device_type": 2 00:17:06.624 } 00:17:06.624 ], 00:17:06.624 "driver_specific": { 00:17:06.624 "passthru": { 00:17:06.624 "name": "pt1", 00:17:06.624 "base_bdev_name": "malloc1" 00:17:06.624 } 00:17:06.624 } 00:17:06.624 }' 00:17:06.624 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.882 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.882 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.882 10:43:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.882 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.882 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.882 10:43:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.882 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.882 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.882 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.141 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.141 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.141 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.141 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:07.141 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.399 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.399 "name": "pt2", 00:17:07.399 "aliases": [ 00:17:07.399 "00000000-0000-0000-0000-000000000002" 00:17:07.399 ], 00:17:07.399 "product_name": "passthru", 00:17:07.399 "block_size": 512, 00:17:07.399 "num_blocks": 65536, 00:17:07.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.399 "assigned_rate_limits": { 00:17:07.399 "rw_ios_per_sec": 0, 00:17:07.399 "rw_mbytes_per_sec": 0, 00:17:07.399 "r_mbytes_per_sec": 0, 00:17:07.399 "w_mbytes_per_sec": 0 00:17:07.399 }, 00:17:07.399 "claimed": true, 00:17:07.399 "claim_type": "exclusive_write", 00:17:07.400 "zoned": false, 00:17:07.400 "supported_io_types": { 00:17:07.400 "read": true, 00:17:07.400 "write": true, 00:17:07.400 "unmap": true, 00:17:07.400 "flush": true, 00:17:07.400 "reset": true, 00:17:07.400 "nvme_admin": false, 00:17:07.400 "nvme_io": false, 00:17:07.400 "nvme_io_md": false, 00:17:07.400 "write_zeroes": true, 00:17:07.400 "zcopy": true, 00:17:07.400 "get_zone_info": false, 00:17:07.400 "zone_management": false, 00:17:07.400 "zone_append": false, 00:17:07.400 "compare": false, 00:17:07.400 "compare_and_write": false, 00:17:07.400 "abort": true, 00:17:07.400 "seek_hole": false, 00:17:07.400 "seek_data": false, 00:17:07.400 "copy": true, 00:17:07.400 "nvme_iov_md": false 00:17:07.400 }, 00:17:07.400 "memory_domains": [ 00:17:07.400 { 00:17:07.400 "dma_device_id": "system", 00:17:07.400 "dma_device_type": 1 00:17:07.400 }, 00:17:07.400 { 00:17:07.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.400 "dma_device_type": 2 00:17:07.400 } 00:17:07.400 ], 00:17:07.400 "driver_specific": { 00:17:07.400 "passthru": { 00:17:07.400 "name": "pt2", 00:17:07.400 "base_bdev_name": "malloc2" 00:17:07.400 } 00:17:07.400 } 00:17:07.400 }' 00:17:07.400 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.400 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.400 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.400 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.400 10:43:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.400 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.400 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:07.658 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.916 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.916 "name": "pt3", 00:17:07.916 "aliases": [ 00:17:07.916 "00000000-0000-0000-0000-000000000003" 00:17:07.916 ], 00:17:07.916 "product_name": "passthru", 00:17:07.916 "block_size": 512, 00:17:07.916 "num_blocks": 65536, 00:17:07.916 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:07.916 "assigned_rate_limits": { 00:17:07.916 "rw_ios_per_sec": 0, 00:17:07.916 "rw_mbytes_per_sec": 0, 00:17:07.916 "r_mbytes_per_sec": 0, 00:17:07.916 "w_mbytes_per_sec": 0 00:17:07.916 }, 00:17:07.916 "claimed": true, 00:17:07.916 "claim_type": "exclusive_write", 00:17:07.916 "zoned": false, 00:17:07.916 "supported_io_types": { 00:17:07.916 "read": true, 00:17:07.916 "write": true, 00:17:07.916 "unmap": true, 00:17:07.916 "flush": true, 00:17:07.916 "reset": true, 00:17:07.916 "nvme_admin": false, 00:17:07.916 "nvme_io": false, 00:17:07.916 "nvme_io_md": false, 00:17:07.916 "write_zeroes": true, 00:17:07.916 "zcopy": true, 00:17:07.916 "get_zone_info": false, 00:17:07.916 "zone_management": false, 00:17:07.916 "zone_append": false, 00:17:07.916 "compare": false, 00:17:07.916 "compare_and_write": false, 00:17:07.916 "abort": true, 00:17:07.916 "seek_hole": false, 00:17:07.916 "seek_data": false, 00:17:07.916 "copy": true, 00:17:07.916 "nvme_iov_md": false 00:17:07.916 }, 00:17:07.916 "memory_domains": [ 00:17:07.916 { 00:17:07.916 "dma_device_id": "system", 00:17:07.916 "dma_device_type": 1 00:17:07.916 }, 00:17:07.916 { 00:17:07.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.916 "dma_device_type": 2 00:17:07.916 } 00:17:07.916 ], 00:17:07.916 "driver_specific": { 00:17:07.916 "passthru": { 00:17:07.916 "name": "pt3", 00:17:07.916 "base_bdev_name": "malloc3" 00:17:07.916 } 00:17:07.916 } 00:17:07.916 }' 00:17:07.916 10:43:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.916 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.916 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.916 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:08.174 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:08.432 [2024-07-12 10:43:43.562330] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:08.432 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ac8013c0-a705-41f5-9ede-eae626c074b9 '!=' ac8013c0-a705-41f5-9ede-eae626c074b9 ']' 00:17:08.432 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:08.432 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:08.432 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:08.432 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:08.690 [2024-07-12 10:43:43.806739] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.690 10:43:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:08.948 10:43:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.948 "name": "raid_bdev1", 00:17:08.948 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:08.948 "strip_size_kb": 0, 00:17:08.948 "state": "online", 00:17:08.948 "raid_level": "raid1", 00:17:08.948 "superblock": true, 
00:17:08.948 "num_base_bdevs": 3, 00:17:08.948 "num_base_bdevs_discovered": 2, 00:17:08.948 "num_base_bdevs_operational": 2, 00:17:08.948 "base_bdevs_list": [ 00:17:08.948 { 00:17:08.948 "name": null, 00:17:08.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.948 "is_configured": false, 00:17:08.948 "data_offset": 2048, 00:17:08.948 "data_size": 63488 00:17:08.948 }, 00:17:08.948 { 00:17:08.948 "name": "pt2", 00:17:08.948 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:08.948 "is_configured": true, 00:17:08.948 "data_offset": 2048, 00:17:08.948 "data_size": 63488 00:17:08.948 }, 00:17:08.948 { 00:17:08.948 "name": "pt3", 00:17:08.948 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:08.948 "is_configured": true, 00:17:08.948 "data_offset": 2048, 00:17:08.948 "data_size": 63488 00:17:08.948 } 00:17:08.948 ] 00:17:08.948 }' 00:17:08.948 10:43:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.948 10:43:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.514 10:43:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:09.771 [2024-07-12 10:43:44.897607] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:09.771 [2024-07-12 10:43:44.897636] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:09.771 [2024-07-12 10:43:44.897694] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:09.771 [2024-07-12 10:43:44.897747] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:09.771 [2024-07-12 10:43:44.897758] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe27c00 name raid_bdev1, state offline 00:17:09.771 10:43:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.771 10:43:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:10.029 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:10.029 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:10.029 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:10.029 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:10.029 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:10.287 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:10.287 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:10.287 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:10.545 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:10.545 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:10.545 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:10.545 10:43:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:10.545 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:10.803 [2024-07-12 10:43:45.884151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:10.803 [2024-07-12 10:43:45.884193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:10.803 [2024-07-12 10:43:45.884210] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc85310 00:17:10.803 [2024-07-12 10:43:45.884222] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:10.803 [2024-07-12 10:43:45.885814] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:10.803 [2024-07-12 10:43:45.885843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:10.803 [2024-07-12 10:43:45.885906] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:10.803 [2024-07-12 10:43:45.885931] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:10.803 pt2 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.803 10:43:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:11.061 10:43:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.061 "name": "raid_bdev1", 00:17:11.061 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:11.061 "strip_size_kb": 0, 00:17:11.061 "state": "configuring", 00:17:11.061 "raid_level": "raid1", 00:17:11.061 "superblock": true, 00:17:11.061 "num_base_bdevs": 3, 00:17:11.061 "num_base_bdevs_discovered": 1, 00:17:11.061 "num_base_bdevs_operational": 2, 00:17:11.061 "base_bdevs_list": [ 00:17:11.061 { 00:17:11.061 "name": null, 00:17:11.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.061 "is_configured": false, 00:17:11.061 "data_offset": 2048, 00:17:11.061 "data_size": 63488 00:17:11.061 }, 00:17:11.061 { 00:17:11.061 "name": "pt2", 00:17:11.061 "uuid": "00000000-0000-0000-0000-000000000002", 
00:17:11.061 "is_configured": true, 00:17:11.061 "data_offset": 2048, 00:17:11.061 "data_size": 63488 00:17:11.061 }, 00:17:11.061 { 00:17:11.061 "name": null, 00:17:11.061 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:11.061 "is_configured": false, 00:17:11.061 "data_offset": 2048, 00:17:11.061 "data_size": 63488 00:17:11.061 } 00:17:11.061 ] 00:17:11.061 }' 00:17:11.061 10:43:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.061 10:43:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.628 10:43:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:11.628 10:43:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:11.628 10:43:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:11.628 10:43:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:11.886 [2024-07-12 10:43:46.983075] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:11.886 [2024-07-12 10:43:46.983129] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.886 [2024-07-12 10:43:46.983150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc83ec0 00:17:11.886 [2024-07-12 10:43:46.983163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.886 [2024-07-12 10:43:46.983527] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.886 [2024-07-12 10:43:46.983547] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:11.886 [2024-07-12 10:43:46.983613] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:11.886 [2024-07-12 10:43:46.983632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:11.886 [2024-07-12 10:43:46.983734] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe25cc0 00:17:11.886 [2024-07-12 10:43:46.983745] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:11.886 [2024-07-12 10:43:46.983911] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe266d0 00:17:11.886 [2024-07-12 10:43:46.984040] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe25cc0 00:17:11.886 [2024-07-12 10:43:46.984050] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe25cc0 00:17:11.886 [2024-07-12 10:43:46.984148] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:11.886 pt3 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:11.886 10:43:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.886 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:12.182 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.182 "name": "raid_bdev1", 00:17:12.182 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:12.182 "strip_size_kb": 0, 00:17:12.182 "state": "online", 00:17:12.182 "raid_level": "raid1", 00:17:12.182 "superblock": true, 00:17:12.182 "num_base_bdevs": 3, 00:17:12.182 "num_base_bdevs_discovered": 2, 00:17:12.182 "num_base_bdevs_operational": 2, 00:17:12.182 "base_bdevs_list": [ 00:17:12.182 { 00:17:12.182 "name": null, 00:17:12.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.182 "is_configured": false, 00:17:12.182 "data_offset": 2048, 00:17:12.182 "data_size": 63488 00:17:12.182 }, 00:17:12.182 { 00:17:12.182 "name": "pt2", 00:17:12.182 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:12.182 "is_configured": true, 00:17:12.182 "data_offset": 2048, 00:17:12.182 "data_size": 63488 00:17:12.182 }, 00:17:12.182 { 00:17:12.182 "name": "pt3", 00:17:12.182 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:12.182 "is_configured": true, 00:17:12.182 "data_offset": 2048, 00:17:12.182 "data_size": 63488 00:17:12.182 } 00:17:12.182 ] 00:17:12.182 }' 00:17:12.182 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.182 10:43:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.771 10:43:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:13.029 [2024-07-12 10:43:48.081983] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:13.029 [2024-07-12 10:43:48.082014] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:13.029 [2024-07-12 10:43:48.082069] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:13.029 [2024-07-12 10:43:48.082121] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:13.029 [2024-07-12 10:43:48.082133] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe25cc0 name raid_bdev1, state offline 00:17:13.029 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.029 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:13.287 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:13.287 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:13.287 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
00:17:13.287 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:13.287 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:13.545 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:13.803 [2024-07-12 10:43:48.811877] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:13.803 [2024-07-12 10:43:48.811923] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:13.803 [2024-07-12 10:43:48.811941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc83ec0 00:17:13.803 [2024-07-12 10:43:48.811953] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.803 [2024-07-12 10:43:48.813558] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.803 [2024-07-12 10:43:48.813589] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:13.803 [2024-07-12 10:43:48.813657] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:13.803 [2024-07-12 10:43:48.813689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:13.803 [2024-07-12 10:43:48.813785] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:13.803 [2024-07-12 10:43:48.813798] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:13.803 [2024-07-12 10:43:48.813811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe25f40 name raid_bdev1, state configuring 00:17:13.803 [2024-07-12 10:43:48.813834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:13.803 pt1 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.803 10:43:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:14.061 10:43:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.061 "name": "raid_bdev1", 00:17:14.061 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:14.061 "strip_size_kb": 0, 00:17:14.061 "state": "configuring", 00:17:14.061 "raid_level": "raid1", 00:17:14.061 "superblock": true, 00:17:14.061 "num_base_bdevs": 3, 00:17:14.061 "num_base_bdevs_discovered": 1, 00:17:14.061 "num_base_bdevs_operational": 2, 00:17:14.061 "base_bdevs_list": [ 00:17:14.061 { 00:17:14.061 "name": null, 00:17:14.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:14.061 "is_configured": false, 00:17:14.061 "data_offset": 2048, 00:17:14.061 "data_size": 63488 00:17:14.061 }, 00:17:14.061 { 00:17:14.061 "name": "pt2", 00:17:14.061 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:14.061 "is_configured": true, 00:17:14.061 "data_offset": 2048, 00:17:14.061 "data_size": 63488 00:17:14.061 }, 00:17:14.061 { 00:17:14.061 "name": null, 00:17:14.061 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:14.061 "is_configured": false, 00:17:14.061 "data_offset": 2048, 00:17:14.061 "data_size": 63488 00:17:14.061 } 00:17:14.061 ] 00:17:14.061 }' 00:17:14.061 10:43:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.061 10:43:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:14.627 10:43:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:14.627 10:43:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:14.885 10:43:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:14.885 10:43:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:14.885 [2024-07-12 10:43:50.075222] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:14.885 [2024-07-12 10:43:50.075277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:14.885 [2024-07-12 10:43:50.075297] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc870c0 00:17:14.885 [2024-07-12 10:43:50.075310] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:14.885 [2024-07-12 10:43:50.075686] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:14.885 [2024-07-12 10:43:50.075706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:14.885 [2024-07-12 10:43:50.075772] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:14.885 [2024-07-12 10:43:50.075793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:14.885 [2024-07-12 10:43:50.075896] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc87a40 00:17:14.885 [2024-07-12 10:43:50.075907] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:14.885 [2024-07-12 10:43:50.076083] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe266c0 00:17:14.885 [2024-07-12 10:43:50.076211] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc87a40 00:17:14.885 [2024-07-12 10:43:50.076222] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc87a40 00:17:14.885 [2024-07-12 10:43:50.076316] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:15.144 pt3 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.144 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.402 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.402 "name": "raid_bdev1", 00:17:15.402 "uuid": "ac8013c0-a705-41f5-9ede-eae626c074b9", 00:17:15.402 "strip_size_kb": 0, 00:17:15.402 "state": "online", 00:17:15.402 "raid_level": "raid1", 00:17:15.402 "superblock": true, 00:17:15.402 "num_base_bdevs": 3, 00:17:15.402 "num_base_bdevs_discovered": 2, 00:17:15.402 "num_base_bdevs_operational": 2, 00:17:15.402 "base_bdevs_list": [ 00:17:15.402 { 00:17:15.402 "name": null, 00:17:15.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.402 "is_configured": false, 00:17:15.402 "data_offset": 2048, 00:17:15.402 "data_size": 63488 00:17:15.402 }, 00:17:15.402 { 00:17:15.402 "name": "pt2", 00:17:15.402 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:15.402 "is_configured": true, 00:17:15.402 "data_offset": 2048, 00:17:15.402 "data_size": 63488 00:17:15.402 }, 00:17:15.402 { 00:17:15.402 "name": "pt3", 00:17:15.402 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:15.402 "is_configured": true, 00:17:15.402 "data_offset": 2048, 00:17:15.402 "data_size": 63488 00:17:15.402 } 00:17:15.402 ] 00:17:15.402 }' 00:17:15.402 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.402 10:43:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.968 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:15.968 10:43:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:16.227 10:43:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:16.227 10:43:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:16.227 10:43:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:16.227 [2024-07-12 10:43:51.419026] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' ac8013c0-a705-41f5-9ede-eae626c074b9 '!=' ac8013c0-a705-41f5-9ede-eae626c074b9 ']' 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2070574 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2070574 ']' 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2070574 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2070574 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2070574' 00:17:16.485 killing process with pid 2070574 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2070574 00:17:16.485 [2024-07-12 10:43:51.484736] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:16.485 [2024-07-12 10:43:51.484789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:16.485 [2024-07-12 10:43:51.484842] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:16.485 [2024-07-12 10:43:51.484853] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc87a40 name raid_bdev1, state offline 00:17:16.485 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2070574 00:17:16.485 [2024-07-12 10:43:51.512098] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:16.744 10:43:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:16.744 00:17:16.744 real 0m21.376s 00:17:16.744 user 0m38.971s 00:17:16.744 sys 0m3.975s 00:17:16.744 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:16.744 10:43:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.744 ************************************ 00:17:16.744 END TEST raid_superblock_test 00:17:16.744 ************************************ 00:17:16.744 10:43:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:16.744 10:43:51 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:16.744 10:43:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:16.744 10:43:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:16.744 10:43:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:16.744 
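[Reference sketch, not part of the captured output] The raid_superblock_test run traced above drives everything through scripts/rpc.py against the spdk-raid.sock socket; a minimal sketch of the same flow, assuming an SPDK target is already listening on /var/tmp/spdk-raid.sock and reusing the malloc/passthru names, sizes, and jq filter that appear in the trace (the pt2/pt3 legs are built the same way as pt1):

    RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # back a passthru bdev with a malloc bdev, as the test does for pt1/pt2/pt3
    $RPC bdev_malloc_create 32 512 -b malloc1
    $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # assemble a raid1 bdev with an on-disk superblock (-s) and inspect its state
    $RPC bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    # tear it down again, as bdev_raid.sh@498/@525 do
    $RPC bdev_raid_delete raid_bdev1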
************************************ 00:17:16.744 START TEST raid_read_error_test 00:17:16.744 ************************************ 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.aUoi3wXhRV 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2073801 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2073801 /var/tmp/spdk-raid.sock 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2073801 ']' 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:16.744 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:16.744 10:43:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.744 [2024-07-12 10:43:51.869267] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:17:16.744 [2024-07-12 10:43:51.869337] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2073801 ] 00:17:17.003 [2024-07-12 10:43:51.986929] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:17.003 [2024-07-12 10:43:52.090124] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:17.003 [2024-07-12 10:43:52.145231] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:17.003 [2024-07-12 10:43:52.145259] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:17.938 10:43:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:17.938 10:43:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:17.938 10:43:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:17.938 10:43:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:17.938 BaseBdev1_malloc 00:17:17.938 10:43:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:18.196 true 00:17:18.196 10:43:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:18.454 [2024-07-12 10:43:53.529291] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:18.454 [2024-07-12 10:43:53.529336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.454 [2024-07-12 10:43:53.529358] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf590d0 00:17:18.454 [2024-07-12 10:43:53.529370] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.454 [2024-07-12 10:43:53.531217] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.454 [2024-07-12 10:43:53.531248] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:18.454 BaseBdev1 00:17:18.454 10:43:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:18.454 10:43:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:18.712 BaseBdev2_malloc 00:17:18.712 10:43:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:18.969 true 00:17:18.969 10:43:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:19.227 [2024-07-12 10:43:54.263834] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:19.227 [2024-07-12 10:43:54.263882] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:19.227 [2024-07-12 10:43:54.263905] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf5d910 00:17:19.227 [2024-07-12 10:43:54.263918] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:19.227 [2024-07-12 10:43:54.265531] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:19.227 [2024-07-12 10:43:54.265560] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:19.227 BaseBdev2 00:17:19.227 10:43:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:19.227 10:43:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:19.485 BaseBdev3_malloc 00:17:19.485 10:43:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:19.743 true 00:17:19.743 10:43:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:20.001 [2024-07-12 10:43:54.939437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:20.001 [2024-07-12 10:43:54.939493] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:20.001 [2024-07-12 10:43:54.939515] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf5fbd0 00:17:20.001 [2024-07-12 10:43:54.939528] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:20.001 [2024-07-12 10:43:54.940997] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:20.001 [2024-07-12 10:43:54.941027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:20.001 BaseBdev3 00:17:20.001 10:43:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:20.001 [2024-07-12 10:43:55.192131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:20.001 [2024-07-12 10:43:55.193415] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:20.001 [2024-07-12 10:43:55.193492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:20.001 [2024-07-12 
10:43:55.193722] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf61280 00:17:20.001 [2024-07-12 10:43:55.193735] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:20.001 [2024-07-12 10:43:55.193939] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf60e20 00:17:20.001 [2024-07-12 10:43:55.194102] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf61280 00:17:20.001 [2024-07-12 10:43:55.194115] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf61280 00:17:20.001 [2024-07-12 10:43:55.194224] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.259 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:20.517 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.517 "name": "raid_bdev1", 00:17:20.517 "uuid": "0ba3f906-4eec-4a0f-9687-e0f76a8f8cbb", 00:17:20.517 "strip_size_kb": 0, 00:17:20.517 "state": "online", 00:17:20.517 "raid_level": "raid1", 00:17:20.517 "superblock": true, 00:17:20.517 "num_base_bdevs": 3, 00:17:20.517 "num_base_bdevs_discovered": 3, 00:17:20.517 "num_base_bdevs_operational": 3, 00:17:20.517 "base_bdevs_list": [ 00:17:20.517 { 00:17:20.517 "name": "BaseBdev1", 00:17:20.517 "uuid": "0cef5d6c-feed-5fde-b459-05310df2f433", 00:17:20.517 "is_configured": true, 00:17:20.517 "data_offset": 2048, 00:17:20.517 "data_size": 63488 00:17:20.517 }, 00:17:20.517 { 00:17:20.517 "name": "BaseBdev2", 00:17:20.517 "uuid": "ac0cd1ba-8687-51fe-ab71-2fdeb12bf05d", 00:17:20.517 "is_configured": true, 00:17:20.517 "data_offset": 2048, 00:17:20.517 "data_size": 63488 00:17:20.517 }, 00:17:20.517 { 00:17:20.517 "name": "BaseBdev3", 00:17:20.517 "uuid": "2800b9ca-f161-5b8a-9b4c-9e770432e77c", 00:17:20.517 "is_configured": true, 00:17:20.517 "data_offset": 2048, 00:17:20.517 "data_size": 63488 00:17:20.517 } 00:17:20.517 ] 00:17:20.517 }' 00:17:20.517 10:43:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.517 10:43:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 
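[Reference sketch, not part of the captured output] Each RAID member assembled above in raid_read_error_test is a passthru bdev stacked on an error-injection bdev over a malloc bdev, which is what later lets the test fail reads on EE_BaseBdev1_malloc while bdevperf drives I/O. One leg of that stack, using only the RPCs and bdev names visible in the trace (bdev_error_create on BaseBdev1_malloc exposes the EE_BaseBdev1_malloc error bdev consumed by the passthru):

    RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $RPC bdev_error_create BaseBdev1_malloc            # exposes EE_BaseBdev1_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    # once BaseBdev2/BaseBdev3 are built the same way:
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
    # inject read failures on one leg; the raid1 bdev is expected to stay online
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure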
00:17:21.451 10:43:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:21.451 10:43:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:21.451 [2024-07-12 10:43:56.427704] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdaee00 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.387 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:22.645 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.645 "name": "raid_bdev1", 00:17:22.645 "uuid": "0ba3f906-4eec-4a0f-9687-e0f76a8f8cbb", 00:17:22.645 "strip_size_kb": 0, 00:17:22.645 "state": "online", 00:17:22.645 "raid_level": "raid1", 00:17:22.645 "superblock": true, 00:17:22.645 "num_base_bdevs": 3, 00:17:22.645 "num_base_bdevs_discovered": 3, 00:17:22.645 "num_base_bdevs_operational": 3, 00:17:22.645 "base_bdevs_list": [ 00:17:22.645 { 00:17:22.645 "name": "BaseBdev1", 00:17:22.645 "uuid": "0cef5d6c-feed-5fde-b459-05310df2f433", 00:17:22.645 "is_configured": true, 00:17:22.645 "data_offset": 2048, 00:17:22.645 "data_size": 63488 00:17:22.645 }, 00:17:22.645 { 00:17:22.645 "name": "BaseBdev2", 00:17:22.645 "uuid": "ac0cd1ba-8687-51fe-ab71-2fdeb12bf05d", 00:17:22.645 "is_configured": true, 00:17:22.645 "data_offset": 2048, 00:17:22.645 "data_size": 63488 00:17:22.645 }, 00:17:22.645 { 00:17:22.645 "name": "BaseBdev3", 00:17:22.645 "uuid": 
"2800b9ca-f161-5b8a-9b4c-9e770432e77c", 00:17:22.645 "is_configured": true, 00:17:22.645 "data_offset": 2048, 00:17:22.645 "data_size": 63488 00:17:22.645 } 00:17:22.645 ] 00:17:22.645 }' 00:17:22.645 10:43:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.645 10:43:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:23.577 [2024-07-12 10:43:58.656620] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:23.577 [2024-07-12 10:43:58.656657] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:23.577 [2024-07-12 10:43:58.659882] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:23.577 [2024-07-12 10:43:58.659917] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:23.577 [2024-07-12 10:43:58.660016] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:23.577 [2024-07-12 10:43:58.660027] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf61280 name raid_bdev1, state offline 00:17:23.577 0 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2073801 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2073801 ']' 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2073801 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2073801 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2073801' 00:17:23.577 killing process with pid 2073801 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2073801 00:17:23.577 [2024-07-12 10:43:58.721718] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:23.577 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2073801 00:17:23.577 [2024-07-12 10:43:58.742901] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.aUoi3wXhRV 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- 
# return 0 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:23.835 00:17:23.835 real 0m7.179s 00:17:23.835 user 0m11.464s 00:17:23.835 sys 0m1.236s 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:23.835 10:43:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.835 ************************************ 00:17:23.835 END TEST raid_read_error_test 00:17:23.835 ************************************ 00:17:23.835 10:43:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:23.835 10:43:59 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:23.835 10:43:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:23.835 10:43:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:23.835 10:43:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:24.093 ************************************ 00:17:24.093 START TEST raid_write_error_test 00:17:24.093 ************************************ 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.SzWuw4uhXY 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2074818 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2074818 /var/tmp/spdk-raid.sock 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2074818 ']' 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:24.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:24.093 10:43:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.093 [2024-07-12 10:43:59.145553] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:17:24.093 [2024-07-12 10:43:59.145627] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2074818 ] 00:17:24.093 [2024-07-12 10:43:59.273653] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:24.351 [2024-07-12 10:43:59.377288] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:24.351 [2024-07-12 10:43:59.437791] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:24.351 [2024-07-12 10:43:59.437828] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:24.915 10:44:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:24.915 10:44:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:24.915 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:24.915 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:25.171 BaseBdev1_malloc 00:17:25.171 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:25.428 true 00:17:25.428 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:25.428 [2024-07-12 10:44:00.587247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:25.428 [2024-07-12 10:44:00.587297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:25.428 [2024-07-12 10:44:00.587317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16110d0 00:17:25.428 [2024-07-12 10:44:00.587330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:25.428 [2024-07-12 10:44:00.589114] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:25.428 [2024-07-12 10:44:00.589146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:25.428 BaseBdev1 00:17:25.428 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:25.428 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:25.685 BaseBdev2_malloc 00:17:25.685 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:25.942 true 00:17:25.942 10:44:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:25.942 [2024-07-12 10:44:01.089230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:25.942 [2024-07-12 10:44:01.089281] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:25.942 [2024-07-12 10:44:01.089301] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1615910 00:17:25.942 [2024-07-12 10:44:01.089313] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:25.942 [2024-07-12 10:44:01.090767] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:25.942 [2024-07-12 10:44:01.090797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:25.942 BaseBdev2 00:17:25.942 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:25.942 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:26.199 BaseBdev3_malloc 00:17:26.199 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:26.456 true 00:17:26.456 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:26.711 [2024-07-12 10:44:01.763699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:26.711 [2024-07-12 10:44:01.763744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:26.711 [2024-07-12 10:44:01.763763] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1617bd0 00:17:26.711 [2024-07-12 10:44:01.763776] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.711 [2024-07-12 10:44:01.765175] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.711 [2024-07-12 10:44:01.765204] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:26.711 BaseBdev3 00:17:26.711 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:26.968 [2024-07-12 10:44:01.944204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:26.968 [2024-07-12 10:44:01.945383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:26.968 [2024-07-12 10:44:01.945451] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:26.968 [2024-07-12 10:44:01.945665] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1619280 00:17:26.968 [2024-07-12 10:44:01.945677] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:26.968 [2024-07-12 10:44:01.945854] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1618e20 00:17:26.968 [2024-07-12 10:44:01.945999] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1619280 00:17:26.968 [2024-07-12 10:44:01.946009] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1619280 00:17:26.968 [2024-07-12 10:44:01.946106] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:26.968 
10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.968 10:44:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:26.968 10:44:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.968 "name": "raid_bdev1", 00:17:26.968 "uuid": "de47174d-76d1-4842-bb4f-2a00c49661a6", 00:17:26.968 "strip_size_kb": 0, 00:17:26.968 "state": "online", 00:17:26.968 "raid_level": "raid1", 00:17:26.968 "superblock": true, 00:17:26.968 "num_base_bdevs": 3, 00:17:26.968 "num_base_bdevs_discovered": 3, 00:17:26.968 "num_base_bdevs_operational": 3, 00:17:26.968 "base_bdevs_list": [ 00:17:26.968 { 00:17:26.968 "name": "BaseBdev1", 00:17:26.968 "uuid": "d644338c-80ae-58e8-bc08-d141c1fc8ac8", 00:17:26.968 "is_configured": true, 00:17:26.968 "data_offset": 2048, 00:17:26.968 "data_size": 63488 00:17:26.968 }, 00:17:26.968 { 00:17:26.968 "name": "BaseBdev2", 00:17:26.968 "uuid": "4ae3a36b-cf68-5ae2-8575-93e3bee00ee6", 00:17:26.968 "is_configured": true, 00:17:26.968 "data_offset": 2048, 00:17:26.968 "data_size": 63488 00:17:26.968 }, 00:17:26.968 { 00:17:26.968 "name": "BaseBdev3", 00:17:26.968 "uuid": "71ea472a-d4a0-5072-8b08-15435d100ed0", 00:17:26.968 "is_configured": true, 00:17:26.968 "data_offset": 2048, 00:17:26.968 "data_size": 63488 00:17:26.968 } 00:17:26.968 ] 00:17:26.968 }' 00:17:26.968 10:44:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.968 10:44:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.897 10:44:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:27.897 10:44:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:27.897 [2024-07-12 10:44:02.862939] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1466e00 00:17:28.826 10:44:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:28.826 [2024-07-12 10:44:03.990556] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:28.826 [2024-07-12 10:44:03.990615] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:28.826 [2024-07-12 10:44:03.990812] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1466e00 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.826 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:29.082 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:29.082 "name": "raid_bdev1", 00:17:29.082 "uuid": "de47174d-76d1-4842-bb4f-2a00c49661a6", 00:17:29.083 "strip_size_kb": 0, 00:17:29.083 "state": "online", 00:17:29.083 "raid_level": "raid1", 00:17:29.083 "superblock": true, 00:17:29.083 "num_base_bdevs": 3, 00:17:29.083 "num_base_bdevs_discovered": 2, 00:17:29.083 "num_base_bdevs_operational": 2, 00:17:29.083 "base_bdevs_list": [ 00:17:29.083 { 00:17:29.083 "name": null, 00:17:29.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:29.083 "is_configured": false, 00:17:29.083 "data_offset": 2048, 00:17:29.083 "data_size": 63488 00:17:29.083 }, 00:17:29.083 { 00:17:29.083 "name": "BaseBdev2", 00:17:29.083 "uuid": "4ae3a36b-cf68-5ae2-8575-93e3bee00ee6", 00:17:29.083 "is_configured": true, 00:17:29.083 "data_offset": 2048, 00:17:29.083 "data_size": 63488 00:17:29.083 }, 00:17:29.083 { 00:17:29.083 "name": "BaseBdev3", 00:17:29.083 "uuid": "71ea472a-d4a0-5072-8b08-15435d100ed0", 00:17:29.083 "is_configured": true, 00:17:29.083 "data_offset": 2048, 00:17:29.083 "data_size": 63488 00:17:29.083 } 00:17:29.083 ] 00:17:29.083 }' 00:17:29.083 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:29.083 
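The degraded-state check above can be retraced with the same RPCs the trace uses (a sketch based only on the commands visible in the log; rpc.py and bdevperf.py paths shortened to the SPDK tree):
  # arm the error bdev under BaseBdev1 so that writes fail
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
  # drive I/O through bdevperf so the injected failure is actually hit
  examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
  # re-read the raid state; after the write error the raid1 volume stays "online",
  # but num_base_bdevs_discovered and num_base_bdevs_operational drop from 3 to 2
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'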
10:44:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.712 10:44:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:29.971 [2024-07-12 10:44:05.077239] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:29.971 [2024-07-12 10:44:05.077286] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:29.971 [2024-07-12 10:44:05.080419] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:29.971 [2024-07-12 10:44:05.080453] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.971 [2024-07-12 10:44:05.080536] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:29.971 [2024-07-12 10:44:05.080548] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1619280 name raid_bdev1, state offline 00:17:29.971 0 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2074818 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2074818 ']' 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2074818 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2074818 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2074818' 00:17:29.971 killing process with pid 2074818 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2074818 00:17:29.971 [2024-07-12 10:44:05.144613] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:29.971 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2074818 00:17:30.228 [2024-07-12 10:44:05.165827] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:30.228 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.SzWuw4uhXY 00:17:30.228 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:30.228 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:30.229 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:30.229 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:30.229 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:30.229 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:30.229 10:44:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:30.229 00:17:30.229 real 0m6.336s 00:17:30.229 user 0m9.857s 00:17:30.229 sys 0m1.147s 00:17:30.229 10:44:05 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:17:30.229 10:44:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.229 ************************************ 00:17:30.229 END TEST raid_write_error_test 00:17:30.229 ************************************ 00:17:30.487 10:44:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:30.487 10:44:05 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:30.487 10:44:05 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:30.487 10:44:05 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:30.487 10:44:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:30.487 10:44:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:30.487 10:44:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:30.487 ************************************ 00:17:30.487 START TEST raid_state_function_test 00:17:30.487 ************************************ 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2075793 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2075793' 00:17:30.487 Process raid pid: 2075793 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2075793 /var/tmp/spdk-raid.sock 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2075793 ']' 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:30.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:30.487 10:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.487 [2024-07-12 10:44:05.558233] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:17:30.487 [2024-07-12 10:44:05.558286] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:30.487 [2024-07-12 10:44:05.670552] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:30.744 [2024-07-12 10:44:05.772628] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:30.744 [2024-07-12 10:44:05.831853] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:30.744 [2024-07-12 10:44:05.831884] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:31.308 10:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:31.308 10:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:31.308 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:31.566 [2024-07-12 10:44:06.657221] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:31.566 [2024-07-12 10:44:06.657265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:31.566 [2024-07-12 10:44:06.657276] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:31.566 [2024-07-12 10:44:06.657288] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:31.566 [2024-07-12 10:44:06.657297] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:31.566 [2024-07-12 10:44:06.657308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:31.566 [2024-07-12 10:44:06.657317] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:31.566 [2024-07-12 10:44:06.657328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.566 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.824 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.824 "name": "Existed_Raid", 00:17:31.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.824 "strip_size_kb": 64, 00:17:31.824 "state": "configuring", 00:17:31.824 "raid_level": "raid0", 00:17:31.824 "superblock": false, 00:17:31.824 "num_base_bdevs": 4, 00:17:31.824 "num_base_bdevs_discovered": 0, 00:17:31.824 "num_base_bdevs_operational": 4, 00:17:31.824 "base_bdevs_list": [ 00:17:31.824 { 00:17:31.824 "name": "BaseBdev1", 00:17:31.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.824 "is_configured": false, 00:17:31.824 "data_offset": 0, 00:17:31.824 "data_size": 0 00:17:31.824 }, 00:17:31.824 { 00:17:31.824 "name": "BaseBdev2", 00:17:31.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.824 "is_configured": false, 00:17:31.824 "data_offset": 0, 00:17:31.824 "data_size": 0 00:17:31.824 }, 00:17:31.824 { 00:17:31.824 "name": "BaseBdev3", 00:17:31.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.824 "is_configured": false, 00:17:31.824 "data_offset": 0, 00:17:31.824 "data_size": 0 00:17:31.824 }, 00:17:31.824 { 00:17:31.824 "name": "BaseBdev4", 00:17:31.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.824 "is_configured": false, 00:17:31.824 "data_offset": 0, 00:17:31.824 "data_size": 0 00:17:31.824 } 00:17:31.824 ] 00:17:31.824 }' 00:17:31.824 10:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.824 10:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.432 10:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:32.690 [2024-07-12 10:44:07.659743] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:32.690 [2024-07-12 10:44:07.659775] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x248eaa0 name Existed_Raid, state configuring 00:17:32.690 10:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:32.946 [2024-07-12 10:44:07.900388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:32.946 [2024-07-12 10:44:07.900420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:32.946 [2024-07-12 10:44:07.900430] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:32.946 [2024-07-12 10:44:07.900441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:32.946 [2024-07-12 10:44:07.900450] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:32.946 [2024-07-12 10:44:07.900461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:32.946 [2024-07-12 10:44:07.900470] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:32.946 
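The state-function test above relies on bdev_raid_create accepting base bdevs that do not exist yet; a condensed sketch of that pattern, using only RPCs shown in the trace (paths shortened):
  # create the raid0 volume before any base bdev exists; it registers in the "configuring" state
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  #   -> "state": "configuring", "num_base_bdevs_discovered": 0
  # adding the malloc base bdevs one by one raises num_base_bdevs_discovered;
  # once all four exist, the volume transitions to "online" (verified later in the trace)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1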
[2024-07-12 10:44:07.900488] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:32.946 10:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:33.203 [2024-07-12 10:44:08.150892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:33.203 BaseBdev1 00:17:33.203 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:33.203 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:33.203 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:33.203 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:33.203 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:33.203 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:33.203 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:33.461 [ 00:17:33.461 { 00:17:33.461 "name": "BaseBdev1", 00:17:33.461 "aliases": [ 00:17:33.461 "fc47addd-d736-464c-9588-6fb00c741fbe" 00:17:33.461 ], 00:17:33.461 "product_name": "Malloc disk", 00:17:33.461 "block_size": 512, 00:17:33.461 "num_blocks": 65536, 00:17:33.461 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:33.461 "assigned_rate_limits": { 00:17:33.461 "rw_ios_per_sec": 0, 00:17:33.461 "rw_mbytes_per_sec": 0, 00:17:33.461 "r_mbytes_per_sec": 0, 00:17:33.461 "w_mbytes_per_sec": 0 00:17:33.461 }, 00:17:33.461 "claimed": true, 00:17:33.461 "claim_type": "exclusive_write", 00:17:33.461 "zoned": false, 00:17:33.461 "supported_io_types": { 00:17:33.461 "read": true, 00:17:33.461 "write": true, 00:17:33.461 "unmap": true, 00:17:33.461 "flush": true, 00:17:33.461 "reset": true, 00:17:33.461 "nvme_admin": false, 00:17:33.461 "nvme_io": false, 00:17:33.461 "nvme_io_md": false, 00:17:33.461 "write_zeroes": true, 00:17:33.461 "zcopy": true, 00:17:33.461 "get_zone_info": false, 00:17:33.461 "zone_management": false, 00:17:33.461 "zone_append": false, 00:17:33.461 "compare": false, 00:17:33.461 "compare_and_write": false, 00:17:33.461 "abort": true, 00:17:33.461 "seek_hole": false, 00:17:33.461 "seek_data": false, 00:17:33.461 "copy": true, 00:17:33.461 "nvme_iov_md": false 00:17:33.461 }, 00:17:33.461 "memory_domains": [ 00:17:33.461 { 00:17:33.461 "dma_device_id": "system", 00:17:33.461 "dma_device_type": 1 00:17:33.461 }, 00:17:33.461 { 00:17:33.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.461 "dma_device_type": 2 00:17:33.461 } 00:17:33.461 ], 00:17:33.461 "driver_specific": {} 00:17:33.461 } 00:17:33.461 ] 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.461 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:33.719 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.719 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.719 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.719 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.719 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.719 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.719 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.719 "name": "Existed_Raid", 00:17:33.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.719 "strip_size_kb": 64, 00:17:33.719 "state": "configuring", 00:17:33.719 "raid_level": "raid0", 00:17:33.719 "superblock": false, 00:17:33.719 "num_base_bdevs": 4, 00:17:33.719 "num_base_bdevs_discovered": 1, 00:17:33.719 "num_base_bdevs_operational": 4, 00:17:33.719 "base_bdevs_list": [ 00:17:33.719 { 00:17:33.719 "name": "BaseBdev1", 00:17:33.719 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:33.719 "is_configured": true, 00:17:33.719 "data_offset": 0, 00:17:33.719 "data_size": 65536 00:17:33.719 }, 00:17:33.719 { 00:17:33.719 "name": "BaseBdev2", 00:17:33.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.719 "is_configured": false, 00:17:33.719 "data_offset": 0, 00:17:33.719 "data_size": 0 00:17:33.719 }, 00:17:33.719 { 00:17:33.719 "name": "BaseBdev3", 00:17:33.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.719 "is_configured": false, 00:17:33.719 "data_offset": 0, 00:17:33.720 "data_size": 0 00:17:33.720 }, 00:17:33.720 { 00:17:33.720 "name": "BaseBdev4", 00:17:33.720 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.720 "is_configured": false, 00:17:33.720 "data_offset": 0, 00:17:33.720 "data_size": 0 00:17:33.720 } 00:17:33.720 ] 00:17:33.720 }' 00:17:33.720 10:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.720 10:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.655 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:34.655 [2024-07-12 10:44:09.739254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:34.655 [2024-07-12 10:44:09.739295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x248e310 name Existed_Raid, state configuring 00:17:34.655 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:34.913 [2024-07-12 10:44:09.915756] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:34.913 [2024-07-12 10:44:09.917194] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:34.913 [2024-07-12 10:44:09.917229] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:34.913 [2024-07-12 10:44:09.917241] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:34.913 [2024-07-12 10:44:09.917253] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:34.913 [2024-07-12 10:44:09.917262] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:34.913 [2024-07-12 10:44:09.917273] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.913 10:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.173 10:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.173 "name": "Existed_Raid", 00:17:35.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.173 "strip_size_kb": 64, 00:17:35.173 "state": "configuring", 00:17:35.173 "raid_level": "raid0", 00:17:35.173 "superblock": false, 00:17:35.173 "num_base_bdevs": 4, 00:17:35.173 "num_base_bdevs_discovered": 1, 00:17:35.173 "num_base_bdevs_operational": 4, 00:17:35.173 "base_bdevs_list": [ 00:17:35.173 { 00:17:35.173 "name": "BaseBdev1", 00:17:35.173 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:35.173 "is_configured": true, 00:17:35.173 "data_offset": 0, 00:17:35.173 "data_size": 65536 00:17:35.173 }, 00:17:35.173 { 00:17:35.173 "name": "BaseBdev2", 00:17:35.173 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:35.173 "is_configured": false, 00:17:35.173 "data_offset": 0, 00:17:35.173 "data_size": 0 00:17:35.173 }, 00:17:35.173 { 00:17:35.173 "name": "BaseBdev3", 00:17:35.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.173 "is_configured": false, 00:17:35.173 "data_offset": 0, 00:17:35.173 "data_size": 0 00:17:35.173 }, 00:17:35.173 { 00:17:35.173 "name": "BaseBdev4", 00:17:35.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.173 "is_configured": false, 00:17:35.173 "data_offset": 0, 00:17:35.173 "data_size": 0 00:17:35.173 } 00:17:35.173 ] 00:17:35.173 }' 00:17:35.173 10:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.173 10:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.741 10:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:35.741 [2024-07-12 10:44:10.934031] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:35.741 BaseBdev2 00:17:36.000 10:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:36.000 10:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:36.000 10:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:36.000 10:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:36.000 10:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:36.000 10:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:36.000 10:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:36.000 10:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:36.259 [ 00:17:36.259 { 00:17:36.259 "name": "BaseBdev2", 00:17:36.259 "aliases": [ 00:17:36.259 "d403994a-3b15-4a68-b8e7-9722507b03c6" 00:17:36.259 ], 00:17:36.259 "product_name": "Malloc disk", 00:17:36.259 "block_size": 512, 00:17:36.259 "num_blocks": 65536, 00:17:36.259 "uuid": "d403994a-3b15-4a68-b8e7-9722507b03c6", 00:17:36.259 "assigned_rate_limits": { 00:17:36.259 "rw_ios_per_sec": 0, 00:17:36.259 "rw_mbytes_per_sec": 0, 00:17:36.259 "r_mbytes_per_sec": 0, 00:17:36.259 "w_mbytes_per_sec": 0 00:17:36.259 }, 00:17:36.259 "claimed": true, 00:17:36.259 "claim_type": "exclusive_write", 00:17:36.259 "zoned": false, 00:17:36.259 "supported_io_types": { 00:17:36.259 "read": true, 00:17:36.259 "write": true, 00:17:36.259 "unmap": true, 00:17:36.259 "flush": true, 00:17:36.259 "reset": true, 00:17:36.259 "nvme_admin": false, 00:17:36.259 "nvme_io": false, 00:17:36.259 "nvme_io_md": false, 00:17:36.259 "write_zeroes": true, 00:17:36.259 "zcopy": true, 00:17:36.259 "get_zone_info": false, 00:17:36.259 "zone_management": false, 00:17:36.259 "zone_append": false, 00:17:36.259 "compare": false, 00:17:36.259 "compare_and_write": false, 00:17:36.259 "abort": true, 00:17:36.259 "seek_hole": false, 00:17:36.259 "seek_data": false, 00:17:36.259 
"copy": true, 00:17:36.259 "nvme_iov_md": false 00:17:36.259 }, 00:17:36.259 "memory_domains": [ 00:17:36.259 { 00:17:36.259 "dma_device_id": "system", 00:17:36.259 "dma_device_type": 1 00:17:36.259 }, 00:17:36.259 { 00:17:36.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.259 "dma_device_type": 2 00:17:36.259 } 00:17:36.259 ], 00:17:36.259 "driver_specific": {} 00:17:36.259 } 00:17:36.259 ] 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.259 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.518 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.518 "name": "Existed_Raid", 00:17:36.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.518 "strip_size_kb": 64, 00:17:36.518 "state": "configuring", 00:17:36.518 "raid_level": "raid0", 00:17:36.518 "superblock": false, 00:17:36.518 "num_base_bdevs": 4, 00:17:36.518 "num_base_bdevs_discovered": 2, 00:17:36.518 "num_base_bdevs_operational": 4, 00:17:36.518 "base_bdevs_list": [ 00:17:36.518 { 00:17:36.518 "name": "BaseBdev1", 00:17:36.518 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:36.518 "is_configured": true, 00:17:36.518 "data_offset": 0, 00:17:36.518 "data_size": 65536 00:17:36.518 }, 00:17:36.518 { 00:17:36.518 "name": "BaseBdev2", 00:17:36.518 "uuid": "d403994a-3b15-4a68-b8e7-9722507b03c6", 00:17:36.518 "is_configured": true, 00:17:36.518 "data_offset": 0, 00:17:36.518 "data_size": 65536 00:17:36.518 }, 00:17:36.518 { 00:17:36.518 "name": "BaseBdev3", 00:17:36.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.518 "is_configured": false, 00:17:36.518 "data_offset": 0, 00:17:36.518 "data_size": 0 00:17:36.518 }, 00:17:36.518 { 00:17:36.518 "name": "BaseBdev4", 00:17:36.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.518 "is_configured": false, 00:17:36.518 
"data_offset": 0, 00:17:36.518 "data_size": 0 00:17:36.518 } 00:17:36.518 ] 00:17:36.518 }' 00:17:36.518 10:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.518 10:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:37.085 [2024-07-12 10:44:12.244896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:37.085 BaseBdev3 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:37.085 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.344 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:37.603 [ 00:17:37.603 { 00:17:37.603 "name": "BaseBdev3", 00:17:37.603 "aliases": [ 00:17:37.603 "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3" 00:17:37.603 ], 00:17:37.603 "product_name": "Malloc disk", 00:17:37.603 "block_size": 512, 00:17:37.603 "num_blocks": 65536, 00:17:37.603 "uuid": "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3", 00:17:37.603 "assigned_rate_limits": { 00:17:37.603 "rw_ios_per_sec": 0, 00:17:37.603 "rw_mbytes_per_sec": 0, 00:17:37.603 "r_mbytes_per_sec": 0, 00:17:37.603 "w_mbytes_per_sec": 0 00:17:37.603 }, 00:17:37.603 "claimed": true, 00:17:37.603 "claim_type": "exclusive_write", 00:17:37.603 "zoned": false, 00:17:37.603 "supported_io_types": { 00:17:37.603 "read": true, 00:17:37.603 "write": true, 00:17:37.603 "unmap": true, 00:17:37.603 "flush": true, 00:17:37.603 "reset": true, 00:17:37.603 "nvme_admin": false, 00:17:37.603 "nvme_io": false, 00:17:37.603 "nvme_io_md": false, 00:17:37.603 "write_zeroes": true, 00:17:37.603 "zcopy": true, 00:17:37.603 "get_zone_info": false, 00:17:37.603 "zone_management": false, 00:17:37.603 "zone_append": false, 00:17:37.603 "compare": false, 00:17:37.603 "compare_and_write": false, 00:17:37.603 "abort": true, 00:17:37.603 "seek_hole": false, 00:17:37.603 "seek_data": false, 00:17:37.603 "copy": true, 00:17:37.603 "nvme_iov_md": false 00:17:37.603 }, 00:17:37.603 "memory_domains": [ 00:17:37.603 { 00:17:37.603 "dma_device_id": "system", 00:17:37.603 "dma_device_type": 1 00:17:37.603 }, 00:17:37.603 { 00:17:37.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.603 "dma_device_type": 2 00:17:37.603 } 00:17:37.603 ], 00:17:37.603 "driver_specific": {} 00:17:37.603 } 00:17:37.603 ] 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:37.603 10:44:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.603 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.862 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.862 "name": "Existed_Raid", 00:17:37.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.862 "strip_size_kb": 64, 00:17:37.862 "state": "configuring", 00:17:37.862 "raid_level": "raid0", 00:17:37.862 "superblock": false, 00:17:37.862 "num_base_bdevs": 4, 00:17:37.862 "num_base_bdevs_discovered": 3, 00:17:37.862 "num_base_bdevs_operational": 4, 00:17:37.862 "base_bdevs_list": [ 00:17:37.862 { 00:17:37.862 "name": "BaseBdev1", 00:17:37.862 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:37.862 "is_configured": true, 00:17:37.862 "data_offset": 0, 00:17:37.862 "data_size": 65536 00:17:37.862 }, 00:17:37.862 { 00:17:37.862 "name": "BaseBdev2", 00:17:37.862 "uuid": "d403994a-3b15-4a68-b8e7-9722507b03c6", 00:17:37.862 "is_configured": true, 00:17:37.862 "data_offset": 0, 00:17:37.862 "data_size": 65536 00:17:37.862 }, 00:17:37.862 { 00:17:37.862 "name": "BaseBdev3", 00:17:37.862 "uuid": "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3", 00:17:37.862 "is_configured": true, 00:17:37.862 "data_offset": 0, 00:17:37.862 "data_size": 65536 00:17:37.862 }, 00:17:37.862 { 00:17:37.862 "name": "BaseBdev4", 00:17:37.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.862 "is_configured": false, 00:17:37.862 "data_offset": 0, 00:17:37.862 "data_size": 0 00:17:37.862 } 00:17:37.862 ] 00:17:37.862 }' 00:17:37.862 10:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.862 10:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:38.428 [2024-07-12 
10:44:13.563772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:38.428 [2024-07-12 10:44:13.563810] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x248f350 00:17:38.428 [2024-07-12 10:44:13.563818] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:38.428 [2024-07-12 10:44:13.564076] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x248f020 00:17:38.428 [2024-07-12 10:44:13.564196] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x248f350 00:17:38.428 [2024-07-12 10:44:13.564206] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x248f350 00:17:38.428 [2024-07-12 10:44:13.564372] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:38.428 BaseBdev4 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:38.428 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:38.686 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:38.945 [ 00:17:38.945 { 00:17:38.945 "name": "BaseBdev4", 00:17:38.945 "aliases": [ 00:17:38.945 "88f72884-0961-4c5c-854e-a703420ec633" 00:17:38.945 ], 00:17:38.945 "product_name": "Malloc disk", 00:17:38.945 "block_size": 512, 00:17:38.945 "num_blocks": 65536, 00:17:38.945 "uuid": "88f72884-0961-4c5c-854e-a703420ec633", 00:17:38.945 "assigned_rate_limits": { 00:17:38.945 "rw_ios_per_sec": 0, 00:17:38.945 "rw_mbytes_per_sec": 0, 00:17:38.945 "r_mbytes_per_sec": 0, 00:17:38.945 "w_mbytes_per_sec": 0 00:17:38.945 }, 00:17:38.945 "claimed": true, 00:17:38.945 "claim_type": "exclusive_write", 00:17:38.945 "zoned": false, 00:17:38.945 "supported_io_types": { 00:17:38.945 "read": true, 00:17:38.945 "write": true, 00:17:38.945 "unmap": true, 00:17:38.945 "flush": true, 00:17:38.945 "reset": true, 00:17:38.945 "nvme_admin": false, 00:17:38.945 "nvme_io": false, 00:17:38.945 "nvme_io_md": false, 00:17:38.945 "write_zeroes": true, 00:17:38.945 "zcopy": true, 00:17:38.945 "get_zone_info": false, 00:17:38.945 "zone_management": false, 00:17:38.945 "zone_append": false, 00:17:38.945 "compare": false, 00:17:38.945 "compare_and_write": false, 00:17:38.945 "abort": true, 00:17:38.945 "seek_hole": false, 00:17:38.945 "seek_data": false, 00:17:38.945 "copy": true, 00:17:38.945 "nvme_iov_md": false 00:17:38.945 }, 00:17:38.945 "memory_domains": [ 00:17:38.945 { 00:17:38.945 "dma_device_id": "system", 00:17:38.945 "dma_device_type": 1 00:17:38.945 }, 00:17:38.945 { 00:17:38.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.945 "dma_device_type": 2 
00:17:38.945 } 00:17:38.945 ], 00:17:38.945 "driver_specific": {} 00:17:38.945 } 00:17:38.945 ] 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.945 10:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.205 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.205 "name": "Existed_Raid", 00:17:39.205 "uuid": "a0da38f7-13dd-4e38-890b-7c318485993a", 00:17:39.205 "strip_size_kb": 64, 00:17:39.205 "state": "online", 00:17:39.205 "raid_level": "raid0", 00:17:39.205 "superblock": false, 00:17:39.205 "num_base_bdevs": 4, 00:17:39.205 "num_base_bdevs_discovered": 4, 00:17:39.205 "num_base_bdevs_operational": 4, 00:17:39.205 "base_bdevs_list": [ 00:17:39.205 { 00:17:39.205 "name": "BaseBdev1", 00:17:39.205 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:39.205 "is_configured": true, 00:17:39.205 "data_offset": 0, 00:17:39.205 "data_size": 65536 00:17:39.205 }, 00:17:39.205 { 00:17:39.205 "name": "BaseBdev2", 00:17:39.205 "uuid": "d403994a-3b15-4a68-b8e7-9722507b03c6", 00:17:39.205 "is_configured": true, 00:17:39.205 "data_offset": 0, 00:17:39.205 "data_size": 65536 00:17:39.205 }, 00:17:39.205 { 00:17:39.205 "name": "BaseBdev3", 00:17:39.205 "uuid": "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3", 00:17:39.205 "is_configured": true, 00:17:39.205 "data_offset": 0, 00:17:39.206 "data_size": 65536 00:17:39.206 }, 00:17:39.206 { 00:17:39.206 "name": "BaseBdev4", 00:17:39.206 "uuid": "88f72884-0961-4c5c-854e-a703420ec633", 00:17:39.206 "is_configured": true, 00:17:39.206 "data_offset": 0, 00:17:39.206 "data_size": 65536 00:17:39.206 } 00:17:39.206 ] 00:17:39.206 }' 00:17:39.206 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.206 10:44:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.773 10:44:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:39.773 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:39.773 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:39.773 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:39.773 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:39.773 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:39.773 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:39.773 10:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:40.032 [2024-07-12 10:44:15.023977] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:40.032 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:40.032 "name": "Existed_Raid", 00:17:40.032 "aliases": [ 00:17:40.032 "a0da38f7-13dd-4e38-890b-7c318485993a" 00:17:40.032 ], 00:17:40.032 "product_name": "Raid Volume", 00:17:40.032 "block_size": 512, 00:17:40.032 "num_blocks": 262144, 00:17:40.032 "uuid": "a0da38f7-13dd-4e38-890b-7c318485993a", 00:17:40.032 "assigned_rate_limits": { 00:17:40.032 "rw_ios_per_sec": 0, 00:17:40.032 "rw_mbytes_per_sec": 0, 00:17:40.032 "r_mbytes_per_sec": 0, 00:17:40.032 "w_mbytes_per_sec": 0 00:17:40.032 }, 00:17:40.032 "claimed": false, 00:17:40.032 "zoned": false, 00:17:40.032 "supported_io_types": { 00:17:40.032 "read": true, 00:17:40.032 "write": true, 00:17:40.032 "unmap": true, 00:17:40.032 "flush": true, 00:17:40.032 "reset": true, 00:17:40.032 "nvme_admin": false, 00:17:40.032 "nvme_io": false, 00:17:40.032 "nvme_io_md": false, 00:17:40.032 "write_zeroes": true, 00:17:40.032 "zcopy": false, 00:17:40.032 "get_zone_info": false, 00:17:40.032 "zone_management": false, 00:17:40.032 "zone_append": false, 00:17:40.032 "compare": false, 00:17:40.032 "compare_and_write": false, 00:17:40.032 "abort": false, 00:17:40.032 "seek_hole": false, 00:17:40.032 "seek_data": false, 00:17:40.032 "copy": false, 00:17:40.032 "nvme_iov_md": false 00:17:40.032 }, 00:17:40.032 "memory_domains": [ 00:17:40.032 { 00:17:40.032 "dma_device_id": "system", 00:17:40.032 "dma_device_type": 1 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.032 "dma_device_type": 2 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "dma_device_id": "system", 00:17:40.032 "dma_device_type": 1 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.032 "dma_device_type": 2 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "dma_device_id": "system", 00:17:40.032 "dma_device_type": 1 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.032 "dma_device_type": 2 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "dma_device_id": "system", 00:17:40.032 "dma_device_type": 1 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.032 "dma_device_type": 2 00:17:40.032 } 00:17:40.032 ], 00:17:40.032 "driver_specific": { 00:17:40.032 "raid": { 00:17:40.032 "uuid": "a0da38f7-13dd-4e38-890b-7c318485993a", 00:17:40.032 "strip_size_kb": 64, 00:17:40.032 
"state": "online", 00:17:40.032 "raid_level": "raid0", 00:17:40.032 "superblock": false, 00:17:40.032 "num_base_bdevs": 4, 00:17:40.032 "num_base_bdevs_discovered": 4, 00:17:40.032 "num_base_bdevs_operational": 4, 00:17:40.032 "base_bdevs_list": [ 00:17:40.032 { 00:17:40.032 "name": "BaseBdev1", 00:17:40.032 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:40.032 "is_configured": true, 00:17:40.032 "data_offset": 0, 00:17:40.032 "data_size": 65536 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "name": "BaseBdev2", 00:17:40.032 "uuid": "d403994a-3b15-4a68-b8e7-9722507b03c6", 00:17:40.032 "is_configured": true, 00:17:40.032 "data_offset": 0, 00:17:40.032 "data_size": 65536 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "name": "BaseBdev3", 00:17:40.032 "uuid": "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3", 00:17:40.032 "is_configured": true, 00:17:40.032 "data_offset": 0, 00:17:40.032 "data_size": 65536 00:17:40.032 }, 00:17:40.032 { 00:17:40.032 "name": "BaseBdev4", 00:17:40.032 "uuid": "88f72884-0961-4c5c-854e-a703420ec633", 00:17:40.032 "is_configured": true, 00:17:40.032 "data_offset": 0, 00:17:40.032 "data_size": 65536 00:17:40.032 } 00:17:40.032 ] 00:17:40.032 } 00:17:40.032 } 00:17:40.032 }' 00:17:40.032 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:40.032 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:40.032 BaseBdev2 00:17:40.032 BaseBdev3 00:17:40.032 BaseBdev4' 00:17:40.032 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:40.032 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:40.032 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:40.291 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:40.291 "name": "BaseBdev1", 00:17:40.291 "aliases": [ 00:17:40.291 "fc47addd-d736-464c-9588-6fb00c741fbe" 00:17:40.291 ], 00:17:40.291 "product_name": "Malloc disk", 00:17:40.291 "block_size": 512, 00:17:40.291 "num_blocks": 65536, 00:17:40.291 "uuid": "fc47addd-d736-464c-9588-6fb00c741fbe", 00:17:40.291 "assigned_rate_limits": { 00:17:40.291 "rw_ios_per_sec": 0, 00:17:40.291 "rw_mbytes_per_sec": 0, 00:17:40.291 "r_mbytes_per_sec": 0, 00:17:40.291 "w_mbytes_per_sec": 0 00:17:40.291 }, 00:17:40.291 "claimed": true, 00:17:40.291 "claim_type": "exclusive_write", 00:17:40.291 "zoned": false, 00:17:40.291 "supported_io_types": { 00:17:40.291 "read": true, 00:17:40.291 "write": true, 00:17:40.291 "unmap": true, 00:17:40.291 "flush": true, 00:17:40.291 "reset": true, 00:17:40.291 "nvme_admin": false, 00:17:40.291 "nvme_io": false, 00:17:40.291 "nvme_io_md": false, 00:17:40.291 "write_zeroes": true, 00:17:40.291 "zcopy": true, 00:17:40.291 "get_zone_info": false, 00:17:40.291 "zone_management": false, 00:17:40.291 "zone_append": false, 00:17:40.291 "compare": false, 00:17:40.291 "compare_and_write": false, 00:17:40.291 "abort": true, 00:17:40.291 "seek_hole": false, 00:17:40.291 "seek_data": false, 00:17:40.291 "copy": true, 00:17:40.291 "nvme_iov_md": false 00:17:40.291 }, 00:17:40.291 "memory_domains": [ 00:17:40.291 { 00:17:40.291 "dma_device_id": "system", 00:17:40.291 "dma_device_type": 1 00:17:40.291 }, 00:17:40.291 { 00:17:40.291 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.291 "dma_device_type": 2 00:17:40.291 } 00:17:40.292 ], 00:17:40.292 "driver_specific": {} 00:17:40.292 }' 00:17:40.292 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.292 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.292 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:40.292 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.292 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:40.550 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:40.808 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:40.808 "name": "BaseBdev2", 00:17:40.808 "aliases": [ 00:17:40.808 "d403994a-3b15-4a68-b8e7-9722507b03c6" 00:17:40.808 ], 00:17:40.808 "product_name": "Malloc disk", 00:17:40.808 "block_size": 512, 00:17:40.808 "num_blocks": 65536, 00:17:40.808 "uuid": "d403994a-3b15-4a68-b8e7-9722507b03c6", 00:17:40.808 "assigned_rate_limits": { 00:17:40.808 "rw_ios_per_sec": 0, 00:17:40.808 "rw_mbytes_per_sec": 0, 00:17:40.808 "r_mbytes_per_sec": 0, 00:17:40.808 "w_mbytes_per_sec": 0 00:17:40.808 }, 00:17:40.808 "claimed": true, 00:17:40.808 "claim_type": "exclusive_write", 00:17:40.808 "zoned": false, 00:17:40.808 "supported_io_types": { 00:17:40.808 "read": true, 00:17:40.808 "write": true, 00:17:40.808 "unmap": true, 00:17:40.808 "flush": true, 00:17:40.808 "reset": true, 00:17:40.808 "nvme_admin": false, 00:17:40.808 "nvme_io": false, 00:17:40.808 "nvme_io_md": false, 00:17:40.808 "write_zeroes": true, 00:17:40.808 "zcopy": true, 00:17:40.808 "get_zone_info": false, 00:17:40.808 "zone_management": false, 00:17:40.808 "zone_append": false, 00:17:40.808 "compare": false, 00:17:40.808 "compare_and_write": false, 00:17:40.808 "abort": true, 00:17:40.808 "seek_hole": false, 00:17:40.808 "seek_data": false, 00:17:40.808 "copy": true, 00:17:40.808 "nvme_iov_md": false 00:17:40.808 }, 00:17:40.808 "memory_domains": [ 00:17:40.808 { 00:17:40.808 "dma_device_id": "system", 00:17:40.808 "dma_device_type": 1 00:17:40.808 }, 00:17:40.808 { 00:17:40.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.808 "dma_device_type": 2 00:17:40.808 } 00:17:40.808 ], 00:17:40.808 "driver_specific": {} 00:17:40.808 }' 00:17:40.808 10:44:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.808 10:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.066 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.324 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:41.324 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:41.324 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:41.324 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:41.582 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:41.582 "name": "BaseBdev3", 00:17:41.582 "aliases": [ 00:17:41.582 "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3" 00:17:41.582 ], 00:17:41.582 "product_name": "Malloc disk", 00:17:41.582 "block_size": 512, 00:17:41.582 "num_blocks": 65536, 00:17:41.582 "uuid": "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3", 00:17:41.582 "assigned_rate_limits": { 00:17:41.582 "rw_ios_per_sec": 0, 00:17:41.582 "rw_mbytes_per_sec": 0, 00:17:41.582 "r_mbytes_per_sec": 0, 00:17:41.582 "w_mbytes_per_sec": 0 00:17:41.582 }, 00:17:41.582 "claimed": true, 00:17:41.582 "claim_type": "exclusive_write", 00:17:41.582 "zoned": false, 00:17:41.582 "supported_io_types": { 00:17:41.582 "read": true, 00:17:41.582 "write": true, 00:17:41.582 "unmap": true, 00:17:41.582 "flush": true, 00:17:41.582 "reset": true, 00:17:41.582 "nvme_admin": false, 00:17:41.582 "nvme_io": false, 00:17:41.582 "nvme_io_md": false, 00:17:41.582 "write_zeroes": true, 00:17:41.582 "zcopy": true, 00:17:41.582 "get_zone_info": false, 00:17:41.582 "zone_management": false, 00:17:41.583 "zone_append": false, 00:17:41.583 "compare": false, 00:17:41.583 "compare_and_write": false, 00:17:41.583 "abort": true, 00:17:41.583 "seek_hole": false, 00:17:41.583 "seek_data": false, 00:17:41.583 "copy": true, 00:17:41.583 "nvme_iov_md": false 00:17:41.583 }, 00:17:41.583 "memory_domains": [ 00:17:41.583 { 00:17:41.583 "dma_device_id": "system", 00:17:41.583 "dma_device_type": 1 00:17:41.583 }, 00:17:41.583 { 00:17:41.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.583 "dma_device_type": 2 00:17:41.583 } 00:17:41.583 ], 00:17:41.583 "driver_specific": {} 00:17:41.583 }' 00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
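The @205-@208 checks above repeat the same four jq probes for every configured base bdev of Existed_Raid. A minimal bash sketch of that loop, reconstructed from the traced rpc.py and jq calls (variable names here are illustrative, not the verbatim bdev_raid.sh code):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # collect the names of all configured base bdevs of the raid volume
  base_bdev_names=$($rpc bdev_get_bdevs -b Existed_Raid \
      | jq -r '.[].driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
  for name in $base_bdev_names; do
      base_bdev_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
      # each base bdev must expose 512-byte blocks and carry no metadata or DIF
      [[ $(jq .block_size    <<< "$base_bdev_info") == 512  ]]
      [[ $(jq .md_size       <<< "$base_bdev_info") == null ]]
      [[ $(jq .md_interleave <<< "$base_bdev_info") == null ]]
      [[ $(jq .dif_type      <<< "$base_bdev_info") == null ]]
  done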
00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.583 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.841 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:41.841 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.841 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.841 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:41.841 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:41.841 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:41.841 10:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.099 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.099 "name": "BaseBdev4", 00:17:42.099 "aliases": [ 00:17:42.099 "88f72884-0961-4c5c-854e-a703420ec633" 00:17:42.099 ], 00:17:42.099 "product_name": "Malloc disk", 00:17:42.099 "block_size": 512, 00:17:42.099 "num_blocks": 65536, 00:17:42.099 "uuid": "88f72884-0961-4c5c-854e-a703420ec633", 00:17:42.099 "assigned_rate_limits": { 00:17:42.099 "rw_ios_per_sec": 0, 00:17:42.099 "rw_mbytes_per_sec": 0, 00:17:42.099 "r_mbytes_per_sec": 0, 00:17:42.099 "w_mbytes_per_sec": 0 00:17:42.099 }, 00:17:42.099 "claimed": true, 00:17:42.099 "claim_type": "exclusive_write", 00:17:42.099 "zoned": false, 00:17:42.099 "supported_io_types": { 00:17:42.099 "read": true, 00:17:42.099 "write": true, 00:17:42.099 "unmap": true, 00:17:42.099 "flush": true, 00:17:42.099 "reset": true, 00:17:42.099 "nvme_admin": false, 00:17:42.099 "nvme_io": false, 00:17:42.099 "nvme_io_md": false, 00:17:42.099 "write_zeroes": true, 00:17:42.099 "zcopy": true, 00:17:42.099 "get_zone_info": false, 00:17:42.099 "zone_management": false, 00:17:42.099 "zone_append": false, 00:17:42.099 "compare": false, 00:17:42.099 "compare_and_write": false, 00:17:42.099 "abort": true, 00:17:42.099 "seek_hole": false, 00:17:42.099 "seek_data": false, 00:17:42.099 "copy": true, 00:17:42.099 "nvme_iov_md": false 00:17:42.099 }, 00:17:42.099 "memory_domains": [ 00:17:42.099 { 00:17:42.099 "dma_device_id": "system", 00:17:42.099 "dma_device_type": 1 00:17:42.099 }, 00:17:42.099 { 00:17:42.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.099 "dma_device_type": 2 00:17:42.099 } 00:17:42.099 ], 00:17:42.099 "driver_specific": {} 00:17:42.099 }' 00:17:42.099 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.099 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.099 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.099 10:44:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.099 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.099 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.358 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.358 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.358 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.358 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.358 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.358 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.358 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:42.615 [2024-07-12 10:44:17.686792] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:42.615 [2024-07-12 10:44:17.686826] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:42.615 [2024-07-12 10:44:17.686876] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.616 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.873 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.873 
"name": "Existed_Raid", 00:17:42.873 "uuid": "a0da38f7-13dd-4e38-890b-7c318485993a", 00:17:42.873 "strip_size_kb": 64, 00:17:42.873 "state": "offline", 00:17:42.873 "raid_level": "raid0", 00:17:42.873 "superblock": false, 00:17:42.873 "num_base_bdevs": 4, 00:17:42.873 "num_base_bdevs_discovered": 3, 00:17:42.873 "num_base_bdevs_operational": 3, 00:17:42.873 "base_bdevs_list": [ 00:17:42.873 { 00:17:42.873 "name": null, 00:17:42.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.873 "is_configured": false, 00:17:42.873 "data_offset": 0, 00:17:42.873 "data_size": 65536 00:17:42.873 }, 00:17:42.873 { 00:17:42.873 "name": "BaseBdev2", 00:17:42.873 "uuid": "d403994a-3b15-4a68-b8e7-9722507b03c6", 00:17:42.873 "is_configured": true, 00:17:42.873 "data_offset": 0, 00:17:42.873 "data_size": 65536 00:17:42.873 }, 00:17:42.873 { 00:17:42.873 "name": "BaseBdev3", 00:17:42.873 "uuid": "4cc78ac8-cc8c-4e79-9898-ea77bd8be7e3", 00:17:42.873 "is_configured": true, 00:17:42.873 "data_offset": 0, 00:17:42.873 "data_size": 65536 00:17:42.873 }, 00:17:42.873 { 00:17:42.873 "name": "BaseBdev4", 00:17:42.873 "uuid": "88f72884-0961-4c5c-854e-a703420ec633", 00:17:42.873 "is_configured": true, 00:17:42.873 "data_offset": 0, 00:17:42.873 "data_size": 65536 00:17:42.873 } 00:17:42.873 ] 00:17:42.873 }' 00:17:42.873 10:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.874 10:44:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.439 10:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:43.439 10:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:43.439 10:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.439 10:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:43.697 10:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:43.697 10:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:43.697 10:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:43.955 [2024-07-12 10:44:19.036410] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:43.955 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:43.955 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:43.955 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.955 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:44.213 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:44.213 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:44.213 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:44.471 
[2024-07-12 10:44:19.550269] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:44.471 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:44.471 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:44.471 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.471 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:44.729 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:44.729 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:44.729 10:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:44.988 [2024-07-12 10:44:20.059961] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:44.988 [2024-07-12 10:44:20.060016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x248f350 name Existed_Raid, state offline 00:17:44.988 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:44.988 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:44.988 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:44.988 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.245 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:45.245 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:45.245 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:45.245 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:45.245 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:45.245 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:45.503 BaseBdev2 00:17:45.503 10:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:45.503 10:44:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:45.503 10:44:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:45.503 10:44:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:45.503 10:44:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:45.503 10:44:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:45.503 10:44:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:45.761 10:44:20 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:46.060 [ 00:17:46.060 { 00:17:46.060 "name": "BaseBdev2", 00:17:46.060 "aliases": [ 00:17:46.060 "c7c68433-97b2-4454-bfbc-f7bbb890cf9d" 00:17:46.060 ], 00:17:46.060 "product_name": "Malloc disk", 00:17:46.060 "block_size": 512, 00:17:46.060 "num_blocks": 65536, 00:17:46.060 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:46.060 "assigned_rate_limits": { 00:17:46.060 "rw_ios_per_sec": 0, 00:17:46.060 "rw_mbytes_per_sec": 0, 00:17:46.060 "r_mbytes_per_sec": 0, 00:17:46.060 "w_mbytes_per_sec": 0 00:17:46.060 }, 00:17:46.060 "claimed": false, 00:17:46.060 "zoned": false, 00:17:46.060 "supported_io_types": { 00:17:46.060 "read": true, 00:17:46.060 "write": true, 00:17:46.060 "unmap": true, 00:17:46.060 "flush": true, 00:17:46.060 "reset": true, 00:17:46.060 "nvme_admin": false, 00:17:46.060 "nvme_io": false, 00:17:46.060 "nvme_io_md": false, 00:17:46.060 "write_zeroes": true, 00:17:46.060 "zcopy": true, 00:17:46.060 "get_zone_info": false, 00:17:46.060 "zone_management": false, 00:17:46.060 "zone_append": false, 00:17:46.060 "compare": false, 00:17:46.060 "compare_and_write": false, 00:17:46.060 "abort": true, 00:17:46.060 "seek_hole": false, 00:17:46.060 "seek_data": false, 00:17:46.060 "copy": true, 00:17:46.060 "nvme_iov_md": false 00:17:46.060 }, 00:17:46.060 "memory_domains": [ 00:17:46.060 { 00:17:46.060 "dma_device_id": "system", 00:17:46.060 "dma_device_type": 1 00:17:46.060 }, 00:17:46.060 { 00:17:46.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.060 "dma_device_type": 2 00:17:46.060 } 00:17:46.060 ], 00:17:46.060 "driver_specific": {} 00:17:46.060 } 00:17:46.060 ] 00:17:46.060 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:46.060 10:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:46.060 10:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:46.060 10:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:46.345 BaseBdev3 00:17:46.345 10:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:46.345 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:46.345 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:46.345 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:46.345 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:46.345 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:46.345 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:46.603 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:46.603 [ 00:17:46.603 { 00:17:46.603 "name": "BaseBdev3", 00:17:46.603 "aliases": [ 00:17:46.603 
"0a64c4ac-71ca-4925-9503-12c5120274aa" 00:17:46.603 ], 00:17:46.603 "product_name": "Malloc disk", 00:17:46.603 "block_size": 512, 00:17:46.603 "num_blocks": 65536, 00:17:46.603 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:46.603 "assigned_rate_limits": { 00:17:46.603 "rw_ios_per_sec": 0, 00:17:46.603 "rw_mbytes_per_sec": 0, 00:17:46.603 "r_mbytes_per_sec": 0, 00:17:46.603 "w_mbytes_per_sec": 0 00:17:46.603 }, 00:17:46.603 "claimed": false, 00:17:46.603 "zoned": false, 00:17:46.603 "supported_io_types": { 00:17:46.603 "read": true, 00:17:46.603 "write": true, 00:17:46.603 "unmap": true, 00:17:46.603 "flush": true, 00:17:46.603 "reset": true, 00:17:46.603 "nvme_admin": false, 00:17:46.603 "nvme_io": false, 00:17:46.603 "nvme_io_md": false, 00:17:46.603 "write_zeroes": true, 00:17:46.603 "zcopy": true, 00:17:46.603 "get_zone_info": false, 00:17:46.603 "zone_management": false, 00:17:46.603 "zone_append": false, 00:17:46.603 "compare": false, 00:17:46.603 "compare_and_write": false, 00:17:46.603 "abort": true, 00:17:46.603 "seek_hole": false, 00:17:46.603 "seek_data": false, 00:17:46.603 "copy": true, 00:17:46.603 "nvme_iov_md": false 00:17:46.603 }, 00:17:46.603 "memory_domains": [ 00:17:46.603 { 00:17:46.603 "dma_device_id": "system", 00:17:46.603 "dma_device_type": 1 00:17:46.603 }, 00:17:46.603 { 00:17:46.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.603 "dma_device_type": 2 00:17:46.603 } 00:17:46.603 ], 00:17:46.603 "driver_specific": {} 00:17:46.603 } 00:17:46.603 ] 00:17:46.603 10:44:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:46.603 10:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:46.603 10:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:46.603 10:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:46.861 BaseBdev4 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:47.119 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:47.378 [ 00:17:47.378 { 00:17:47.378 "name": "BaseBdev4", 00:17:47.378 "aliases": [ 00:17:47.378 "cb774dda-24be-4d60-a24a-283f31c23f41" 00:17:47.378 ], 00:17:47.378 "product_name": "Malloc disk", 00:17:47.378 "block_size": 512, 00:17:47.378 "num_blocks": 65536, 00:17:47.378 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:47.378 "assigned_rate_limits": { 00:17:47.378 
"rw_ios_per_sec": 0, 00:17:47.378 "rw_mbytes_per_sec": 0, 00:17:47.378 "r_mbytes_per_sec": 0, 00:17:47.378 "w_mbytes_per_sec": 0 00:17:47.378 }, 00:17:47.378 "claimed": false, 00:17:47.378 "zoned": false, 00:17:47.378 "supported_io_types": { 00:17:47.378 "read": true, 00:17:47.378 "write": true, 00:17:47.378 "unmap": true, 00:17:47.378 "flush": true, 00:17:47.378 "reset": true, 00:17:47.378 "nvme_admin": false, 00:17:47.378 "nvme_io": false, 00:17:47.378 "nvme_io_md": false, 00:17:47.378 "write_zeroes": true, 00:17:47.378 "zcopy": true, 00:17:47.378 "get_zone_info": false, 00:17:47.378 "zone_management": false, 00:17:47.378 "zone_append": false, 00:17:47.378 "compare": false, 00:17:47.378 "compare_and_write": false, 00:17:47.378 "abort": true, 00:17:47.378 "seek_hole": false, 00:17:47.378 "seek_data": false, 00:17:47.378 "copy": true, 00:17:47.378 "nvme_iov_md": false 00:17:47.378 }, 00:17:47.378 "memory_domains": [ 00:17:47.378 { 00:17:47.378 "dma_device_id": "system", 00:17:47.378 "dma_device_type": 1 00:17:47.378 }, 00:17:47.378 { 00:17:47.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.378 "dma_device_type": 2 00:17:47.378 } 00:17:47.378 ], 00:17:47.378 "driver_specific": {} 00:17:47.378 } 00:17:47.378 ] 00:17:47.378 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:47.378 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:47.378 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:47.378 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:47.637 [2024-07-12 10:44:22.757028] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:47.637 [2024-07-12 10:44:22.757073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:47.637 [2024-07-12 10:44:22.757093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:47.637 [2024-07-12 10:44:22.758419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:47.637 [2024-07-12 10:44:22.758463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.637 10:44:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.637 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:47.895 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.895 "name": "Existed_Raid", 00:17:47.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.895 "strip_size_kb": 64, 00:17:47.895 "state": "configuring", 00:17:47.895 "raid_level": "raid0", 00:17:47.895 "superblock": false, 00:17:47.895 "num_base_bdevs": 4, 00:17:47.895 "num_base_bdevs_discovered": 3, 00:17:47.895 "num_base_bdevs_operational": 4, 00:17:47.895 "base_bdevs_list": [ 00:17:47.895 { 00:17:47.895 "name": "BaseBdev1", 00:17:47.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.895 "is_configured": false, 00:17:47.895 "data_offset": 0, 00:17:47.895 "data_size": 0 00:17:47.895 }, 00:17:47.895 { 00:17:47.895 "name": "BaseBdev2", 00:17:47.895 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:47.895 "is_configured": true, 00:17:47.895 "data_offset": 0, 00:17:47.895 "data_size": 65536 00:17:47.895 }, 00:17:47.895 { 00:17:47.895 "name": "BaseBdev3", 00:17:47.895 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:47.895 "is_configured": true, 00:17:47.895 "data_offset": 0, 00:17:47.895 "data_size": 65536 00:17:47.895 }, 00:17:47.895 { 00:17:47.895 "name": "BaseBdev4", 00:17:47.895 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:47.895 "is_configured": true, 00:17:47.895 "data_offset": 0, 00:17:47.895 "data_size": 65536 00:17:47.895 } 00:17:47.895 ] 00:17:47.895 }' 00:17:47.895 10:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.895 10:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.829 10:44:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:49.088 [2024-07-12 10:44:24.036399] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.088 10:44:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.088 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:49.347 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.347 "name": "Existed_Raid", 00:17:49.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.347 "strip_size_kb": 64, 00:17:49.347 "state": "configuring", 00:17:49.347 "raid_level": "raid0", 00:17:49.347 "superblock": false, 00:17:49.347 "num_base_bdevs": 4, 00:17:49.347 "num_base_bdevs_discovered": 2, 00:17:49.347 "num_base_bdevs_operational": 4, 00:17:49.347 "base_bdevs_list": [ 00:17:49.347 { 00:17:49.347 "name": "BaseBdev1", 00:17:49.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.347 "is_configured": false, 00:17:49.347 "data_offset": 0, 00:17:49.347 "data_size": 0 00:17:49.347 }, 00:17:49.347 { 00:17:49.347 "name": null, 00:17:49.347 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:49.347 "is_configured": false, 00:17:49.347 "data_offset": 0, 00:17:49.347 "data_size": 65536 00:17:49.347 }, 00:17:49.347 { 00:17:49.347 "name": "BaseBdev3", 00:17:49.347 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:49.347 "is_configured": true, 00:17:49.347 "data_offset": 0, 00:17:49.347 "data_size": 65536 00:17:49.347 }, 00:17:49.347 { 00:17:49.347 "name": "BaseBdev4", 00:17:49.347 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:49.347 "is_configured": true, 00:17:49.347 "data_offset": 0, 00:17:49.347 "data_size": 65536 00:17:49.347 } 00:17:49.347 ] 00:17:49.347 }' 00:17:49.347 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.347 10:44:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.912 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.912 10:44:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:50.170 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:50.170 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:50.428 [2024-07-12 10:44:25.371461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:50.428 BaseBdev1 00:17:50.428 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:50.428 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:50.428 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:50.428 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:50.428 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:50.428 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:50.428 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:50.686 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:50.686 [ 00:17:50.686 { 00:17:50.686 "name": "BaseBdev1", 00:17:50.686 "aliases": [ 00:17:50.686 "7204142f-3555-4c9c-b68f-facad24876ef" 00:17:50.686 ], 00:17:50.686 "product_name": "Malloc disk", 00:17:50.686 "block_size": 512, 00:17:50.686 "num_blocks": 65536, 00:17:50.686 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:50.686 "assigned_rate_limits": { 00:17:50.686 "rw_ios_per_sec": 0, 00:17:50.686 "rw_mbytes_per_sec": 0, 00:17:50.686 "r_mbytes_per_sec": 0, 00:17:50.686 "w_mbytes_per_sec": 0 00:17:50.686 }, 00:17:50.686 "claimed": true, 00:17:50.686 "claim_type": "exclusive_write", 00:17:50.686 "zoned": false, 00:17:50.686 "supported_io_types": { 00:17:50.686 "read": true, 00:17:50.686 "write": true, 00:17:50.686 "unmap": true, 00:17:50.686 "flush": true, 00:17:50.686 "reset": true, 00:17:50.686 "nvme_admin": false, 00:17:50.686 "nvme_io": false, 00:17:50.686 "nvme_io_md": false, 00:17:50.686 "write_zeroes": true, 00:17:50.686 "zcopy": true, 00:17:50.686 "get_zone_info": false, 00:17:50.686 "zone_management": false, 00:17:50.686 "zone_append": false, 00:17:50.686 "compare": false, 00:17:50.686 "compare_and_write": false, 00:17:50.686 "abort": true, 00:17:50.686 "seek_hole": false, 00:17:50.686 "seek_data": false, 00:17:50.686 "copy": true, 00:17:50.686 "nvme_iov_md": false 00:17:50.686 }, 00:17:50.686 "memory_domains": [ 00:17:50.686 { 00:17:50.686 "dma_device_id": "system", 00:17:50.686 "dma_device_type": 1 00:17:50.686 }, 00:17:50.686 { 00:17:50.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.686 "dma_device_type": 2 00:17:50.686 } 00:17:50.686 ], 00:17:50.686 "driver_specific": {} 00:17:50.686 } 00:17:50.686 ] 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.944 10:44:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:17:50.944 10:44:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.944 "name": "Existed_Raid", 00:17:50.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.944 "strip_size_kb": 64, 00:17:50.944 "state": "configuring", 00:17:50.944 "raid_level": "raid0", 00:17:50.944 "superblock": false, 00:17:50.944 "num_base_bdevs": 4, 00:17:50.944 "num_base_bdevs_discovered": 3, 00:17:50.944 "num_base_bdevs_operational": 4, 00:17:50.944 "base_bdevs_list": [ 00:17:50.944 { 00:17:50.944 "name": "BaseBdev1", 00:17:50.944 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:50.944 "is_configured": true, 00:17:50.944 "data_offset": 0, 00:17:50.944 "data_size": 65536 00:17:50.944 }, 00:17:50.944 { 00:17:50.944 "name": null, 00:17:50.944 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:50.944 "is_configured": false, 00:17:50.944 "data_offset": 0, 00:17:50.944 "data_size": 65536 00:17:50.944 }, 00:17:50.944 { 00:17:50.944 "name": "BaseBdev3", 00:17:50.944 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:50.944 "is_configured": true, 00:17:50.944 "data_offset": 0, 00:17:50.944 "data_size": 65536 00:17:50.944 }, 00:17:50.944 { 00:17:50.944 "name": "BaseBdev4", 00:17:50.944 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:50.944 "is_configured": true, 00:17:50.944 "data_offset": 0, 00:17:50.944 "data_size": 65536 00:17:50.944 } 00:17:50.944 ] 00:17:50.944 }' 00:17:50.944 10:44:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.944 10:44:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.876 10:44:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.876 10:44:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:51.876 10:44:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:51.877 10:44:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:52.135 [2024-07-12 10:44:27.212383] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.135 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.393 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.393 "name": "Existed_Raid", 00:17:52.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.393 "strip_size_kb": 64, 00:17:52.393 "state": "configuring", 00:17:52.393 "raid_level": "raid0", 00:17:52.393 "superblock": false, 00:17:52.393 "num_base_bdevs": 4, 00:17:52.393 "num_base_bdevs_discovered": 2, 00:17:52.393 "num_base_bdevs_operational": 4, 00:17:52.393 "base_bdevs_list": [ 00:17:52.393 { 00:17:52.393 "name": "BaseBdev1", 00:17:52.393 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:52.393 "is_configured": true, 00:17:52.393 "data_offset": 0, 00:17:52.393 "data_size": 65536 00:17:52.393 }, 00:17:52.393 { 00:17:52.393 "name": null, 00:17:52.393 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:52.393 "is_configured": false, 00:17:52.393 "data_offset": 0, 00:17:52.393 "data_size": 65536 00:17:52.393 }, 00:17:52.393 { 00:17:52.393 "name": null, 00:17:52.393 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:52.393 "is_configured": false, 00:17:52.393 "data_offset": 0, 00:17:52.393 "data_size": 65536 00:17:52.393 }, 00:17:52.393 { 00:17:52.393 "name": "BaseBdev4", 00:17:52.393 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:52.393 "is_configured": true, 00:17:52.393 "data_offset": 0, 00:17:52.393 "data_size": 65536 00:17:52.393 } 00:17:52.393 ] 00:17:52.393 }' 00:17:52.393 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.393 10:44:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.960 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.960 10:44:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:53.217 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:53.218 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:53.476 [2024-07-12 10:44:28.459933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:53.476 10:44:28 
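The @315-@322 sequence above exercises hot removal and re-addition of a single base bdev while the array is still assembling: the slot of the removed bdev flips to is_configured == false, and flips back once the bdev is re-added and claimed. Sketched from the traced calls (same socket and jq probes as in the log):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_raid_remove_base_bdev BaseBdev3
  [[ $($rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured') == false ]]
  $rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  [[ $($rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured') == true ]]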
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.476 "name": "Existed_Raid", 00:17:53.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.476 "strip_size_kb": 64, 00:17:53.476 "state": "configuring", 00:17:53.476 "raid_level": "raid0", 00:17:53.476 "superblock": false, 00:17:53.476 "num_base_bdevs": 4, 00:17:53.476 "num_base_bdevs_discovered": 3, 00:17:53.476 "num_base_bdevs_operational": 4, 00:17:53.476 "base_bdevs_list": [ 00:17:53.476 { 00:17:53.476 "name": "BaseBdev1", 00:17:53.476 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:53.476 "is_configured": true, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 65536 00:17:53.476 }, 00:17:53.476 { 00:17:53.476 "name": null, 00:17:53.476 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:53.476 "is_configured": false, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 65536 00:17:53.476 }, 00:17:53.476 { 00:17:53.476 "name": "BaseBdev3", 00:17:53.476 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:53.476 "is_configured": true, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 65536 00:17:53.476 }, 00:17:53.476 { 00:17:53.476 "name": "BaseBdev4", 00:17:53.476 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:53.476 "is_configured": true, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 65536 00:17:53.476 } 00:17:53.476 ] 00:17:53.476 }' 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.476 10:44:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.410 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.410 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:54.410 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:54.410 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:54.666 [2024-07-12 10:44:29.643092] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:54.666 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:54.666 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.667 10:44:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.667 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.924 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.924 "name": "Existed_Raid", 00:17:54.924 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.924 "strip_size_kb": 64, 00:17:54.924 "state": "configuring", 00:17:54.924 "raid_level": "raid0", 00:17:54.924 "superblock": false, 00:17:54.924 "num_base_bdevs": 4, 00:17:54.924 "num_base_bdevs_discovered": 2, 00:17:54.924 "num_base_bdevs_operational": 4, 00:17:54.924 "base_bdevs_list": [ 00:17:54.924 { 00:17:54.924 "name": null, 00:17:54.924 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:54.924 "is_configured": false, 00:17:54.924 "data_offset": 0, 00:17:54.924 "data_size": 65536 00:17:54.924 }, 00:17:54.924 { 00:17:54.924 "name": null, 00:17:54.924 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:54.924 "is_configured": false, 00:17:54.924 "data_offset": 0, 00:17:54.924 "data_size": 65536 00:17:54.924 }, 00:17:54.924 { 00:17:54.924 "name": "BaseBdev3", 00:17:54.924 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:54.924 "is_configured": true, 00:17:54.924 "data_offset": 0, 00:17:54.924 "data_size": 65536 00:17:54.924 }, 00:17:54.924 { 00:17:54.924 "name": "BaseBdev4", 00:17:54.924 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:54.924 "is_configured": true, 00:17:54.924 "data_offset": 0, 00:17:54.924 "data_size": 65536 00:17:54.924 } 00:17:54.924 ] 00:17:54.924 }' 00:17:54.924 10:44:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.924 10:44:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.488 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.488 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:55.746 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:55.746 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:55.746 [2024-07-12 10:44:30.934946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:56.004 10:44:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.004 10:44:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.262 10:44:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.262 "name": "Existed_Raid", 00:17:56.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.262 "strip_size_kb": 64, 00:17:56.262 "state": "configuring", 00:17:56.262 "raid_level": "raid0", 00:17:56.262 "superblock": false, 00:17:56.262 "num_base_bdevs": 4, 00:17:56.262 "num_base_bdevs_discovered": 3, 00:17:56.262 "num_base_bdevs_operational": 4, 00:17:56.262 "base_bdevs_list": [ 00:17:56.262 { 00:17:56.262 "name": null, 00:17:56.262 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:56.262 "is_configured": false, 00:17:56.262 "data_offset": 0, 00:17:56.262 "data_size": 65536 00:17:56.262 }, 00:17:56.262 { 00:17:56.262 "name": "BaseBdev2", 00:17:56.262 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:56.262 "is_configured": true, 00:17:56.262 "data_offset": 0, 00:17:56.262 "data_size": 65536 00:17:56.262 }, 00:17:56.262 { 00:17:56.262 "name": "BaseBdev3", 00:17:56.262 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:56.262 "is_configured": true, 00:17:56.262 "data_offset": 0, 00:17:56.262 "data_size": 65536 00:17:56.262 }, 00:17:56.262 { 00:17:56.262 "name": "BaseBdev4", 00:17:56.262 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:56.262 "is_configured": true, 00:17:56.262 "data_offset": 0, 00:17:56.262 "data_size": 65536 00:17:56.262 } 00:17:56.262 ] 00:17:56.262 }' 00:17:56.262 10:44:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.262 10:44:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.827 10:44:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.827 10:44:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:57.085 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:57.085 
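In the configuring dumps above, a hot-removed slot keeps its position in base_bdevs_list but is reported with a null name, an all-zero uuid and is_configured == false; that is the shape the @327 and @331 jq probes key off before and after a bdev is re-added. As an illustrative one-liner against the same RPC socket as in the trace, the unconfigured slots of the array can be listed with:

  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq '.[] | select(.name == "Existed_Raid").base_bdevs_list[] | select(.is_configured == false)'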
10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.085 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:57.342 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7204142f-3555-4c9c-b68f-facad24876ef 00:17:57.600 [2024-07-12 10:44:32.539755] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:57.600 [2024-07-12 10:44:32.539795] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2493040 00:17:57.600 [2024-07-12 10:44:32.539804] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:57.600 [2024-07-12 10:44:32.540005] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x248ea70 00:17:57.600 [2024-07-12 10:44:32.540119] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2493040 00:17:57.600 [2024-07-12 10:44:32.540129] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2493040 00:17:57.600 [2024-07-12 10:44:32.540291] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:57.600 NewBaseBdev 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:57.600 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:57.858 [ 00:17:57.858 { 00:17:57.858 "name": "NewBaseBdev", 00:17:57.858 "aliases": [ 00:17:57.858 "7204142f-3555-4c9c-b68f-facad24876ef" 00:17:57.858 ], 00:17:57.858 "product_name": "Malloc disk", 00:17:57.858 "block_size": 512, 00:17:57.858 "num_blocks": 65536, 00:17:57.858 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:57.858 "assigned_rate_limits": { 00:17:57.858 "rw_ios_per_sec": 0, 00:17:57.858 "rw_mbytes_per_sec": 0, 00:17:57.858 "r_mbytes_per_sec": 0, 00:17:57.858 "w_mbytes_per_sec": 0 00:17:57.858 }, 00:17:57.858 "claimed": true, 00:17:57.858 "claim_type": "exclusive_write", 00:17:57.858 "zoned": false, 00:17:57.858 "supported_io_types": { 00:17:57.858 "read": true, 00:17:57.858 "write": true, 00:17:57.858 "unmap": true, 00:17:57.858 "flush": true, 00:17:57.858 "reset": true, 00:17:57.858 "nvme_admin": false, 00:17:57.858 "nvme_io": false, 00:17:57.858 "nvme_io_md": false, 00:17:57.858 "write_zeroes": true, 00:17:57.858 "zcopy": true, 
00:17:57.858 "get_zone_info": false, 00:17:57.858 "zone_management": false, 00:17:57.858 "zone_append": false, 00:17:57.858 "compare": false, 00:17:57.858 "compare_and_write": false, 00:17:57.858 "abort": true, 00:17:57.858 "seek_hole": false, 00:17:57.858 "seek_data": false, 00:17:57.858 "copy": true, 00:17:57.858 "nvme_iov_md": false 00:17:57.858 }, 00:17:57.858 "memory_domains": [ 00:17:57.858 { 00:17:57.858 "dma_device_id": "system", 00:17:57.858 "dma_device_type": 1 00:17:57.858 }, 00:17:57.858 { 00:17:57.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.858 "dma_device_type": 2 00:17:57.858 } 00:17:57.858 ], 00:17:57.858 "driver_specific": {} 00:17:57.858 } 00:17:57.858 ] 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.858 10:44:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.116 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.116 "name": "Existed_Raid", 00:17:58.116 "uuid": "1dc8d0d2-655a-43b9-816c-9b7c92209b88", 00:17:58.116 "strip_size_kb": 64, 00:17:58.116 "state": "online", 00:17:58.116 "raid_level": "raid0", 00:17:58.116 "superblock": false, 00:17:58.116 "num_base_bdevs": 4, 00:17:58.116 "num_base_bdevs_discovered": 4, 00:17:58.116 "num_base_bdevs_operational": 4, 00:17:58.116 "base_bdevs_list": [ 00:17:58.116 { 00:17:58.116 "name": "NewBaseBdev", 00:17:58.116 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:58.116 "is_configured": true, 00:17:58.116 "data_offset": 0, 00:17:58.116 "data_size": 65536 00:17:58.116 }, 00:17:58.116 { 00:17:58.116 "name": "BaseBdev2", 00:17:58.116 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:58.116 "is_configured": true, 00:17:58.116 "data_offset": 0, 00:17:58.116 "data_size": 65536 00:17:58.116 }, 00:17:58.116 { 00:17:58.116 "name": "BaseBdev3", 00:17:58.116 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:58.116 "is_configured": true, 00:17:58.116 "data_offset": 0, 00:17:58.116 "data_size": 65536 00:17:58.116 }, 00:17:58.116 { 00:17:58.116 "name": "BaseBdev4", 00:17:58.116 "uuid": 
"cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:58.116 "is_configured": true, 00:17:58.116 "data_offset": 0, 00:17:58.116 "data_size": 65536 00:17:58.116 } 00:17:58.116 ] 00:17:58.116 }' 00:17:58.116 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.116 10:44:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:58.683 10:44:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:58.942 [2024-07-12 10:44:33.999981] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:58.942 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:58.942 "name": "Existed_Raid", 00:17:58.942 "aliases": [ 00:17:58.942 "1dc8d0d2-655a-43b9-816c-9b7c92209b88" 00:17:58.942 ], 00:17:58.942 "product_name": "Raid Volume", 00:17:58.942 "block_size": 512, 00:17:58.942 "num_blocks": 262144, 00:17:58.942 "uuid": "1dc8d0d2-655a-43b9-816c-9b7c92209b88", 00:17:58.942 "assigned_rate_limits": { 00:17:58.942 "rw_ios_per_sec": 0, 00:17:58.942 "rw_mbytes_per_sec": 0, 00:17:58.942 "r_mbytes_per_sec": 0, 00:17:58.942 "w_mbytes_per_sec": 0 00:17:58.942 }, 00:17:58.942 "claimed": false, 00:17:58.942 "zoned": false, 00:17:58.942 "supported_io_types": { 00:17:58.942 "read": true, 00:17:58.942 "write": true, 00:17:58.942 "unmap": true, 00:17:58.942 "flush": true, 00:17:58.942 "reset": true, 00:17:58.942 "nvme_admin": false, 00:17:58.942 "nvme_io": false, 00:17:58.942 "nvme_io_md": false, 00:17:58.942 "write_zeroes": true, 00:17:58.942 "zcopy": false, 00:17:58.942 "get_zone_info": false, 00:17:58.942 "zone_management": false, 00:17:58.942 "zone_append": false, 00:17:58.942 "compare": false, 00:17:58.942 "compare_and_write": false, 00:17:58.942 "abort": false, 00:17:58.942 "seek_hole": false, 00:17:58.942 "seek_data": false, 00:17:58.942 "copy": false, 00:17:58.942 "nvme_iov_md": false 00:17:58.942 }, 00:17:58.942 "memory_domains": [ 00:17:58.942 { 00:17:58.942 "dma_device_id": "system", 00:17:58.942 "dma_device_type": 1 00:17:58.942 }, 00:17:58.942 { 00:17:58.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.942 "dma_device_type": 2 00:17:58.942 }, 00:17:58.942 { 00:17:58.942 "dma_device_id": "system", 00:17:58.943 "dma_device_type": 1 00:17:58.943 }, 00:17:58.943 { 00:17:58.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.943 "dma_device_type": 2 00:17:58.943 }, 00:17:58.943 { 00:17:58.943 "dma_device_id": "system", 00:17:58.943 "dma_device_type": 1 00:17:58.943 }, 00:17:58.943 { 00:17:58.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.943 "dma_device_type": 2 00:17:58.943 }, 
00:17:58.943 { 00:17:58.943 "dma_device_id": "system", 00:17:58.943 "dma_device_type": 1 00:17:58.943 }, 00:17:58.943 { 00:17:58.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.943 "dma_device_type": 2 00:17:58.943 } 00:17:58.943 ], 00:17:58.943 "driver_specific": { 00:17:58.943 "raid": { 00:17:58.943 "uuid": "1dc8d0d2-655a-43b9-816c-9b7c92209b88", 00:17:58.943 "strip_size_kb": 64, 00:17:58.943 "state": "online", 00:17:58.943 "raid_level": "raid0", 00:17:58.943 "superblock": false, 00:17:58.943 "num_base_bdevs": 4, 00:17:58.943 "num_base_bdevs_discovered": 4, 00:17:58.943 "num_base_bdevs_operational": 4, 00:17:58.943 "base_bdevs_list": [ 00:17:58.943 { 00:17:58.943 "name": "NewBaseBdev", 00:17:58.943 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:58.943 "is_configured": true, 00:17:58.943 "data_offset": 0, 00:17:58.943 "data_size": 65536 00:17:58.943 }, 00:17:58.943 { 00:17:58.943 "name": "BaseBdev2", 00:17:58.943 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:58.943 "is_configured": true, 00:17:58.943 "data_offset": 0, 00:17:58.943 "data_size": 65536 00:17:58.943 }, 00:17:58.943 { 00:17:58.943 "name": "BaseBdev3", 00:17:58.943 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:17:58.943 "is_configured": true, 00:17:58.943 "data_offset": 0, 00:17:58.943 "data_size": 65536 00:17:58.943 }, 00:17:58.943 { 00:17:58.943 "name": "BaseBdev4", 00:17:58.943 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:17:58.943 "is_configured": true, 00:17:58.943 "data_offset": 0, 00:17:58.943 "data_size": 65536 00:17:58.943 } 00:17:58.943 ] 00:17:58.943 } 00:17:58.943 } 00:17:58.943 }' 00:17:58.943 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:58.943 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:58.943 BaseBdev2 00:17:58.943 BaseBdev3 00:17:58.943 BaseBdev4' 00:17:58.943 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.943 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:58.943 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.200 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.200 "name": "NewBaseBdev", 00:17:59.200 "aliases": [ 00:17:59.200 "7204142f-3555-4c9c-b68f-facad24876ef" 00:17:59.200 ], 00:17:59.200 "product_name": "Malloc disk", 00:17:59.200 "block_size": 512, 00:17:59.200 "num_blocks": 65536, 00:17:59.200 "uuid": "7204142f-3555-4c9c-b68f-facad24876ef", 00:17:59.200 "assigned_rate_limits": { 00:17:59.200 "rw_ios_per_sec": 0, 00:17:59.200 "rw_mbytes_per_sec": 0, 00:17:59.200 "r_mbytes_per_sec": 0, 00:17:59.200 "w_mbytes_per_sec": 0 00:17:59.200 }, 00:17:59.200 "claimed": true, 00:17:59.200 "claim_type": "exclusive_write", 00:17:59.200 "zoned": false, 00:17:59.200 "supported_io_types": { 00:17:59.200 "read": true, 00:17:59.200 "write": true, 00:17:59.200 "unmap": true, 00:17:59.200 "flush": true, 00:17:59.200 "reset": true, 00:17:59.200 "nvme_admin": false, 00:17:59.200 "nvme_io": false, 00:17:59.200 "nvme_io_md": false, 00:17:59.200 "write_zeroes": true, 00:17:59.200 "zcopy": true, 00:17:59.200 "get_zone_info": false, 00:17:59.200 "zone_management": false, 00:17:59.200 "zone_append": false, 
00:17:59.201 "compare": false, 00:17:59.201 "compare_and_write": false, 00:17:59.201 "abort": true, 00:17:59.201 "seek_hole": false, 00:17:59.201 "seek_data": false, 00:17:59.201 "copy": true, 00:17:59.201 "nvme_iov_md": false 00:17:59.201 }, 00:17:59.201 "memory_domains": [ 00:17:59.201 { 00:17:59.201 "dma_device_id": "system", 00:17:59.201 "dma_device_type": 1 00:17:59.201 }, 00:17:59.201 { 00:17:59.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.201 "dma_device_type": 2 00:17:59.201 } 00:17:59.201 ], 00:17:59.201 "driver_specific": {} 00:17:59.201 }' 00:17:59.201 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.201 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.201 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:59.201 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.458 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.458 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:59.458 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.458 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.458 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.458 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.458 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.716 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.716 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:59.716 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.716 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:59.974 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.974 "name": "BaseBdev2", 00:17:59.974 "aliases": [ 00:17:59.974 "c7c68433-97b2-4454-bfbc-f7bbb890cf9d" 00:17:59.974 ], 00:17:59.974 "product_name": "Malloc disk", 00:17:59.974 "block_size": 512, 00:17:59.974 "num_blocks": 65536, 00:17:59.974 "uuid": "c7c68433-97b2-4454-bfbc-f7bbb890cf9d", 00:17:59.974 "assigned_rate_limits": { 00:17:59.974 "rw_ios_per_sec": 0, 00:17:59.974 "rw_mbytes_per_sec": 0, 00:17:59.974 "r_mbytes_per_sec": 0, 00:17:59.974 "w_mbytes_per_sec": 0 00:17:59.974 }, 00:17:59.974 "claimed": true, 00:17:59.974 "claim_type": "exclusive_write", 00:17:59.974 "zoned": false, 00:17:59.974 "supported_io_types": { 00:17:59.974 "read": true, 00:17:59.974 "write": true, 00:17:59.974 "unmap": true, 00:17:59.974 "flush": true, 00:17:59.974 "reset": true, 00:17:59.974 "nvme_admin": false, 00:17:59.974 "nvme_io": false, 00:17:59.974 "nvme_io_md": false, 00:17:59.974 "write_zeroes": true, 00:17:59.974 "zcopy": true, 00:17:59.974 "get_zone_info": false, 00:17:59.974 "zone_management": false, 00:17:59.974 "zone_append": false, 00:17:59.974 "compare": false, 00:17:59.974 "compare_and_write": false, 00:17:59.974 "abort": true, 00:17:59.974 "seek_hole": false, 00:17:59.974 "seek_data": false, 00:17:59.974 
"copy": true, 00:17:59.974 "nvme_iov_md": false 00:17:59.974 }, 00:17:59.974 "memory_domains": [ 00:17:59.974 { 00:17:59.974 "dma_device_id": "system", 00:17:59.974 "dma_device_type": 1 00:17:59.974 }, 00:17:59.974 { 00:17:59.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.974 "dma_device_type": 2 00:17:59.974 } 00:17:59.974 ], 00:17:59.974 "driver_specific": {} 00:17:59.974 }' 00:17:59.974 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.974 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.974 10:44:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:59.974 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.974 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.974 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:59.974 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.974 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.232 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.232 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.232 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.232 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.232 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:00.232 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:00.232 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:00.490 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:00.490 "name": "BaseBdev3", 00:18:00.490 "aliases": [ 00:18:00.490 "0a64c4ac-71ca-4925-9503-12c5120274aa" 00:18:00.490 ], 00:18:00.490 "product_name": "Malloc disk", 00:18:00.490 "block_size": 512, 00:18:00.490 "num_blocks": 65536, 00:18:00.490 "uuid": "0a64c4ac-71ca-4925-9503-12c5120274aa", 00:18:00.490 "assigned_rate_limits": { 00:18:00.490 "rw_ios_per_sec": 0, 00:18:00.490 "rw_mbytes_per_sec": 0, 00:18:00.490 "r_mbytes_per_sec": 0, 00:18:00.490 "w_mbytes_per_sec": 0 00:18:00.490 }, 00:18:00.490 "claimed": true, 00:18:00.490 "claim_type": "exclusive_write", 00:18:00.490 "zoned": false, 00:18:00.490 "supported_io_types": { 00:18:00.490 "read": true, 00:18:00.490 "write": true, 00:18:00.490 "unmap": true, 00:18:00.490 "flush": true, 00:18:00.490 "reset": true, 00:18:00.490 "nvme_admin": false, 00:18:00.490 "nvme_io": false, 00:18:00.490 "nvme_io_md": false, 00:18:00.490 "write_zeroes": true, 00:18:00.490 "zcopy": true, 00:18:00.490 "get_zone_info": false, 00:18:00.490 "zone_management": false, 00:18:00.490 "zone_append": false, 00:18:00.490 "compare": false, 00:18:00.490 "compare_and_write": false, 00:18:00.490 "abort": true, 00:18:00.490 "seek_hole": false, 00:18:00.490 "seek_data": false, 00:18:00.490 "copy": true, 00:18:00.490 "nvme_iov_md": false 00:18:00.490 }, 00:18:00.490 "memory_domains": [ 00:18:00.490 { 00:18:00.490 "dma_device_id": "system", 00:18:00.490 
"dma_device_type": 1 00:18:00.490 }, 00:18:00.490 { 00:18:00.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.490 "dma_device_type": 2 00:18:00.490 } 00:18:00.490 ], 00:18:00.490 "driver_specific": {} 00:18:00.490 }' 00:18:00.491 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.491 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.491 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:00.491 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.491 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:00.748 10:44:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.007 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.007 "name": "BaseBdev4", 00:18:01.007 "aliases": [ 00:18:01.007 "cb774dda-24be-4d60-a24a-283f31c23f41" 00:18:01.007 ], 00:18:01.007 "product_name": "Malloc disk", 00:18:01.007 "block_size": 512, 00:18:01.007 "num_blocks": 65536, 00:18:01.007 "uuid": "cb774dda-24be-4d60-a24a-283f31c23f41", 00:18:01.007 "assigned_rate_limits": { 00:18:01.007 "rw_ios_per_sec": 0, 00:18:01.007 "rw_mbytes_per_sec": 0, 00:18:01.007 "r_mbytes_per_sec": 0, 00:18:01.007 "w_mbytes_per_sec": 0 00:18:01.007 }, 00:18:01.007 "claimed": true, 00:18:01.007 "claim_type": "exclusive_write", 00:18:01.007 "zoned": false, 00:18:01.007 "supported_io_types": { 00:18:01.007 "read": true, 00:18:01.007 "write": true, 00:18:01.007 "unmap": true, 00:18:01.007 "flush": true, 00:18:01.007 "reset": true, 00:18:01.007 "nvme_admin": false, 00:18:01.007 "nvme_io": false, 00:18:01.007 "nvme_io_md": false, 00:18:01.007 "write_zeroes": true, 00:18:01.007 "zcopy": true, 00:18:01.007 "get_zone_info": false, 00:18:01.007 "zone_management": false, 00:18:01.007 "zone_append": false, 00:18:01.007 "compare": false, 00:18:01.007 "compare_and_write": false, 00:18:01.007 "abort": true, 00:18:01.007 "seek_hole": false, 00:18:01.007 "seek_data": false, 00:18:01.007 "copy": true, 00:18:01.007 "nvme_iov_md": false 00:18:01.007 }, 00:18:01.007 "memory_domains": [ 00:18:01.007 { 00:18:01.007 "dma_device_id": "system", 00:18:01.007 "dma_device_type": 1 00:18:01.007 }, 00:18:01.007 { 00:18:01.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.007 "dma_device_type": 2 00:18:01.007 } 00:18:01.007 ], 
00:18:01.007 "driver_specific": {} 00:18:01.007 }' 00:18:01.007 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.007 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.007 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:01.007 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.265 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.265 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.265 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.265 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.265 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.265 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.265 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.523 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.523 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:01.523 [2024-07-12 10:44:36.694790] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:01.523 [2024-07-12 10:44:36.694821] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:01.523 [2024-07-12 10:44:36.694874] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:01.523 [2024-07-12 10:44:36.694933] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:01.523 [2024-07-12 10:44:36.694945] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2493040 name Existed_Raid, state offline 00:18:01.523 10:44:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2075793 00:18:01.523 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2075793 ']' 00:18:01.523 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2075793 00:18:01.523 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:01.781 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:01.781 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2075793 00:18:01.781 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:01.781 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:01.781 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2075793' 00:18:01.781 killing process with pid 2075793 00:18:01.781 10:44:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2075793 00:18:01.781 [2024-07-12 10:44:36.760987] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:01.781 10:44:36 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@972 -- # wait 2075793 00:18:01.781 [2024-07-12 10:44:36.799626] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:02.040 10:44:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:02.040 00:18:02.040 real 0m31.511s 00:18:02.040 user 0m57.809s 00:18:02.040 sys 0m5.695s 00:18:02.040 10:44:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.041 ************************************ 00:18:02.041 END TEST raid_state_function_test 00:18:02.041 ************************************ 00:18:02.041 10:44:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:02.041 10:44:37 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:02.041 10:44:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:02.041 10:44:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:02.041 10:44:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:02.041 ************************************ 00:18:02.041 START TEST raid_state_function_test_sb 00:18:02.041 ************************************ 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 
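A minimal sketch of the verification pattern the raid_state_function_test run above keeps repeating: it re-reads the raid bdev over the RPC socket, filters the JSON with jq, and re-adds the removed base bdev by creating a malloc bdev with the original UUID so the raid claims it and transitions from configuring back to online. Only rpc.py calls and arguments that appear in this trace are used; the paths are shortened to the spdk tree and the shell variable names are illustrative, not the script's own helpers.

  # the raid reports 'configuring' while one base bdev slot is unconfigured
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .state'
  # recover the UUID of the empty slot and recreate it as a malloc bdev of the same size
  uuid=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[0].base_bdevs_list[0].uuid')
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"
  # once the new bdev is claimed the raid goes online with all 4 base bdevs discovered
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq '.[0] | {state, num_base_bdevs_discovered}'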
00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2080502 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2080502' 00:18:02.041 Process raid pid: 2080502 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2080502 /var/tmp/spdk-raid.sock 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2080502 ']' 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:02.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:02.041 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:02.041 [2024-07-12 10:44:37.154446] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
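The raid_state_function_test_sb variant starting here drives a dedicated bdev_svc app on its own RPC socket and, because superblock=true, passes -s to bdev_raid_create. A rough sketch of the sequence the trace goes on to perform, assembled only from commands visible in this log (paths shortened to the spdk tree, error handling and the test's waitforlisten/waitforbdev helpers omitted):

  # launch the bdev service with raid debug logging on a private RPC socket
  test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  # create the raid first: -z 64 sets the strip size, -s requests an on-disk superblock
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  # the base bdevs do not exist yet, so the raid sits in 'configuring' with 0 discovered;
  # creating them one by one bumps num_base_bdevs_discovered until the raid can go online
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .num_base_bdevs_discovered'

With -s each base bdev reserves space for the superblock, which is why the trace below reports data_offset 2048 and data_size 63488 for the configured slots instead of the 0/65536 seen in the non-superblock run above.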
00:18:02.041 [2024-07-12 10:44:37.154527] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:02.300 [2024-07-12 10:44:37.284297] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:02.300 [2024-07-12 10:44:37.386146] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:02.300 [2024-07-12 10:44:37.447718] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:02.300 [2024-07-12 10:44:37.447754] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:02.913 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:02.913 10:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:02.914 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:03.190 [2024-07-12 10:44:38.220071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:03.190 [2024-07-12 10:44:38.220113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:03.190 [2024-07-12 10:44:38.220123] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:03.190 [2024-07-12 10:44:38.220135] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:03.190 [2024-07-12 10:44:38.220144] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:03.190 [2024-07-12 10:44:38.220155] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:03.190 [2024-07-12 10:44:38.220163] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:03.190 [2024-07-12 10:44:38.220174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.190 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.759 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.759 "name": "Existed_Raid", 00:18:03.759 "uuid": "7e94bfdd-9a87-4cb3-b437-7721ae565fe0", 00:18:03.759 "strip_size_kb": 64, 00:18:03.759 "state": "configuring", 00:18:03.759 "raid_level": "raid0", 00:18:03.759 "superblock": true, 00:18:03.759 "num_base_bdevs": 4, 00:18:03.759 "num_base_bdevs_discovered": 0, 00:18:03.759 "num_base_bdevs_operational": 4, 00:18:03.759 "base_bdevs_list": [ 00:18:03.759 { 00:18:03.759 "name": "BaseBdev1", 00:18:03.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.759 "is_configured": false, 00:18:03.759 "data_offset": 0, 00:18:03.759 "data_size": 0 00:18:03.759 }, 00:18:03.759 { 00:18:03.759 "name": "BaseBdev2", 00:18:03.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.759 "is_configured": false, 00:18:03.759 "data_offset": 0, 00:18:03.759 "data_size": 0 00:18:03.759 }, 00:18:03.759 { 00:18:03.759 "name": "BaseBdev3", 00:18:03.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.759 "is_configured": false, 00:18:03.759 "data_offset": 0, 00:18:03.759 "data_size": 0 00:18:03.759 }, 00:18:03.759 { 00:18:03.759 "name": "BaseBdev4", 00:18:03.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.759 "is_configured": false, 00:18:03.759 "data_offset": 0, 00:18:03.759 "data_size": 0 00:18:03.759 } 00:18:03.759 ] 00:18:03.759 }' 00:18:03.759 10:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.759 10:44:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.327 10:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:04.584 [2024-07-12 10:44:39.571476] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:04.584 [2024-07-12 10:44:39.571512] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfdcaa0 name Existed_Raid, state configuring 00:18:04.584 10:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:04.841 [2024-07-12 10:44:39.816151] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:04.841 [2024-07-12 10:44:39.816180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:04.841 [2024-07-12 10:44:39.816190] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:04.841 [2024-07-12 10:44:39.816202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:04.841 [2024-07-12 10:44:39.816211] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:04.841 [2024-07-12 10:44:39.816222] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:04.841 [2024-07-12 10:44:39.816231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 
00:18:04.841 [2024-07-12 10:44:39.816242] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:04.841 10:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:05.099 [2024-07-12 10:44:40.066552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:05.099 BaseBdev1 00:18:05.099 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:05.099 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:05.099 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:05.099 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:05.100 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:05.100 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:05.100 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:05.100 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:05.358 [ 00:18:05.358 { 00:18:05.358 "name": "BaseBdev1", 00:18:05.358 "aliases": [ 00:18:05.358 "25ddbed6-d51b-4cb1-9580-6f78784ac2cd" 00:18:05.358 ], 00:18:05.358 "product_name": "Malloc disk", 00:18:05.359 "block_size": 512, 00:18:05.359 "num_blocks": 65536, 00:18:05.359 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:05.359 "assigned_rate_limits": { 00:18:05.359 "rw_ios_per_sec": 0, 00:18:05.359 "rw_mbytes_per_sec": 0, 00:18:05.359 "r_mbytes_per_sec": 0, 00:18:05.359 "w_mbytes_per_sec": 0 00:18:05.359 }, 00:18:05.359 "claimed": true, 00:18:05.359 "claim_type": "exclusive_write", 00:18:05.359 "zoned": false, 00:18:05.359 "supported_io_types": { 00:18:05.359 "read": true, 00:18:05.359 "write": true, 00:18:05.359 "unmap": true, 00:18:05.359 "flush": true, 00:18:05.359 "reset": true, 00:18:05.359 "nvme_admin": false, 00:18:05.359 "nvme_io": false, 00:18:05.359 "nvme_io_md": false, 00:18:05.359 "write_zeroes": true, 00:18:05.359 "zcopy": true, 00:18:05.359 "get_zone_info": false, 00:18:05.359 "zone_management": false, 00:18:05.359 "zone_append": false, 00:18:05.359 "compare": false, 00:18:05.359 "compare_and_write": false, 00:18:05.359 "abort": true, 00:18:05.359 "seek_hole": false, 00:18:05.359 "seek_data": false, 00:18:05.359 "copy": true, 00:18:05.359 "nvme_iov_md": false 00:18:05.359 }, 00:18:05.359 "memory_domains": [ 00:18:05.359 { 00:18:05.359 "dma_device_id": "system", 00:18:05.359 "dma_device_type": 1 00:18:05.359 }, 00:18:05.359 { 00:18:05.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.359 "dma_device_type": 2 00:18:05.359 } 00:18:05.359 ], 00:18:05.359 "driver_specific": {} 00:18:05.359 } 00:18:05.359 ] 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:05.359 
10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.359 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.618 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.618 "name": "Existed_Raid", 00:18:05.618 "uuid": "a0753d96-99ef-44db-8773-0020b19dd172", 00:18:05.618 "strip_size_kb": 64, 00:18:05.618 "state": "configuring", 00:18:05.618 "raid_level": "raid0", 00:18:05.618 "superblock": true, 00:18:05.618 "num_base_bdevs": 4, 00:18:05.618 "num_base_bdevs_discovered": 1, 00:18:05.618 "num_base_bdevs_operational": 4, 00:18:05.618 "base_bdevs_list": [ 00:18:05.618 { 00:18:05.618 "name": "BaseBdev1", 00:18:05.618 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:05.618 "is_configured": true, 00:18:05.618 "data_offset": 2048, 00:18:05.618 "data_size": 63488 00:18:05.618 }, 00:18:05.618 { 00:18:05.618 "name": "BaseBdev2", 00:18:05.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.618 "is_configured": false, 00:18:05.618 "data_offset": 0, 00:18:05.618 "data_size": 0 00:18:05.618 }, 00:18:05.618 { 00:18:05.618 "name": "BaseBdev3", 00:18:05.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.618 "is_configured": false, 00:18:05.618 "data_offset": 0, 00:18:05.618 "data_size": 0 00:18:05.618 }, 00:18:05.618 { 00:18:05.618 "name": "BaseBdev4", 00:18:05.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.618 "is_configured": false, 00:18:05.618 "data_offset": 0, 00:18:05.618 "data_size": 0 00:18:05.618 } 00:18:05.618 ] 00:18:05.618 }' 00:18:05.618 10:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.618 10:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.230 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:06.488 [2024-07-12 10:44:41.498345] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:06.488 [2024-07-12 10:44:41.498382] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfdc310 name Existed_Raid, state configuring 00:18:06.488 10:44:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:06.746 [2024-07-12 10:44:41.743030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:06.746 [2024-07-12 10:44:41.744561] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:06.746 [2024-07-12 10:44:41.744594] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:06.746 [2024-07-12 10:44:41.744605] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:06.746 [2024-07-12 10:44:41.744616] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:06.746 [2024-07-12 10:44:41.744626] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:06.746 [2024-07-12 10:44:41.744637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.746 10:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:07.004 10:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.004 "name": "Existed_Raid", 00:18:07.004 "uuid": "e07df5a4-fdcf-42a9-abd2-35b296cd53b5", 00:18:07.004 "strip_size_kb": 64, 00:18:07.004 "state": "configuring", 00:18:07.004 "raid_level": "raid0", 00:18:07.004 "superblock": true, 00:18:07.004 "num_base_bdevs": 4, 00:18:07.004 "num_base_bdevs_discovered": 1, 00:18:07.004 "num_base_bdevs_operational": 4, 00:18:07.004 "base_bdevs_list": [ 00:18:07.004 { 00:18:07.004 "name": "BaseBdev1", 00:18:07.004 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:07.004 "is_configured": true, 00:18:07.004 "data_offset": 2048, 
00:18:07.004 "data_size": 63488 00:18:07.004 }, 00:18:07.004 { 00:18:07.004 "name": "BaseBdev2", 00:18:07.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.004 "is_configured": false, 00:18:07.004 "data_offset": 0, 00:18:07.004 "data_size": 0 00:18:07.004 }, 00:18:07.004 { 00:18:07.004 "name": "BaseBdev3", 00:18:07.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.004 "is_configured": false, 00:18:07.004 "data_offset": 0, 00:18:07.004 "data_size": 0 00:18:07.004 }, 00:18:07.004 { 00:18:07.004 "name": "BaseBdev4", 00:18:07.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.004 "is_configured": false, 00:18:07.004 "data_offset": 0, 00:18:07.004 "data_size": 0 00:18:07.004 } 00:18:07.004 ] 00:18:07.004 }' 00:18:07.004 10:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.004 10:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:07.571 10:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:07.830 [2024-07-12 10:44:42.849303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:07.830 BaseBdev2 00:18:07.830 10:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:07.830 10:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:07.830 10:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:07.830 10:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:07.830 10:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:07.830 10:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:07.830 10:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:08.088 10:44:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:08.346 [ 00:18:08.346 { 00:18:08.346 "name": "BaseBdev2", 00:18:08.346 "aliases": [ 00:18:08.346 "1fb90800-1db2-48ff-ab6f-45d5c97152ae" 00:18:08.346 ], 00:18:08.346 "product_name": "Malloc disk", 00:18:08.346 "block_size": 512, 00:18:08.346 "num_blocks": 65536, 00:18:08.346 "uuid": "1fb90800-1db2-48ff-ab6f-45d5c97152ae", 00:18:08.346 "assigned_rate_limits": { 00:18:08.346 "rw_ios_per_sec": 0, 00:18:08.346 "rw_mbytes_per_sec": 0, 00:18:08.346 "r_mbytes_per_sec": 0, 00:18:08.346 "w_mbytes_per_sec": 0 00:18:08.346 }, 00:18:08.346 "claimed": true, 00:18:08.346 "claim_type": "exclusive_write", 00:18:08.346 "zoned": false, 00:18:08.346 "supported_io_types": { 00:18:08.346 "read": true, 00:18:08.346 "write": true, 00:18:08.346 "unmap": true, 00:18:08.346 "flush": true, 00:18:08.346 "reset": true, 00:18:08.346 "nvme_admin": false, 00:18:08.346 "nvme_io": false, 00:18:08.346 "nvme_io_md": false, 00:18:08.346 "write_zeroes": true, 00:18:08.346 "zcopy": true, 00:18:08.346 "get_zone_info": false, 00:18:08.346 "zone_management": false, 00:18:08.346 "zone_append": false, 00:18:08.346 "compare": false, 
00:18:08.346 "compare_and_write": false, 00:18:08.346 "abort": true, 00:18:08.346 "seek_hole": false, 00:18:08.346 "seek_data": false, 00:18:08.346 "copy": true, 00:18:08.346 "nvme_iov_md": false 00:18:08.346 }, 00:18:08.346 "memory_domains": [ 00:18:08.346 { 00:18:08.346 "dma_device_id": "system", 00:18:08.346 "dma_device_type": 1 00:18:08.346 }, 00:18:08.346 { 00:18:08.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.346 "dma_device_type": 2 00:18:08.346 } 00:18:08.346 ], 00:18:08.346 "driver_specific": {} 00:18:08.346 } 00:18:08.346 ] 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.346 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.605 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.605 "name": "Existed_Raid", 00:18:08.605 "uuid": "e07df5a4-fdcf-42a9-abd2-35b296cd53b5", 00:18:08.605 "strip_size_kb": 64, 00:18:08.605 "state": "configuring", 00:18:08.605 "raid_level": "raid0", 00:18:08.605 "superblock": true, 00:18:08.605 "num_base_bdevs": 4, 00:18:08.605 "num_base_bdevs_discovered": 2, 00:18:08.605 "num_base_bdevs_operational": 4, 00:18:08.605 "base_bdevs_list": [ 00:18:08.605 { 00:18:08.605 "name": "BaseBdev1", 00:18:08.605 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:08.605 "is_configured": true, 00:18:08.605 "data_offset": 2048, 00:18:08.605 "data_size": 63488 00:18:08.605 }, 00:18:08.605 { 00:18:08.605 "name": "BaseBdev2", 00:18:08.605 "uuid": "1fb90800-1db2-48ff-ab6f-45d5c97152ae", 00:18:08.605 "is_configured": true, 00:18:08.605 "data_offset": 2048, 00:18:08.605 "data_size": 63488 00:18:08.605 }, 00:18:08.605 { 00:18:08.605 "name": "BaseBdev3", 00:18:08.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.605 "is_configured": false, 00:18:08.605 "data_offset": 0, 00:18:08.605 
"data_size": 0 00:18:08.605 }, 00:18:08.605 { 00:18:08.605 "name": "BaseBdev4", 00:18:08.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.605 "is_configured": false, 00:18:08.605 "data_offset": 0, 00:18:08.605 "data_size": 0 00:18:08.605 } 00:18:08.605 ] 00:18:08.605 }' 00:18:08.605 10:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.605 10:44:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:09.173 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:09.432 [2024-07-12 10:44:44.420873] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:09.432 BaseBdev3 00:18:09.432 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:09.432 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:09.432 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:09.432 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:09.432 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:09.432 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:09.432 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:09.691 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:09.951 [ 00:18:09.951 { 00:18:09.951 "name": "BaseBdev3", 00:18:09.951 "aliases": [ 00:18:09.951 "c2bebadb-e160-409d-95ba-96f3851dca03" 00:18:09.951 ], 00:18:09.951 "product_name": "Malloc disk", 00:18:09.951 "block_size": 512, 00:18:09.951 "num_blocks": 65536, 00:18:09.951 "uuid": "c2bebadb-e160-409d-95ba-96f3851dca03", 00:18:09.951 "assigned_rate_limits": { 00:18:09.951 "rw_ios_per_sec": 0, 00:18:09.951 "rw_mbytes_per_sec": 0, 00:18:09.951 "r_mbytes_per_sec": 0, 00:18:09.951 "w_mbytes_per_sec": 0 00:18:09.951 }, 00:18:09.951 "claimed": true, 00:18:09.951 "claim_type": "exclusive_write", 00:18:09.951 "zoned": false, 00:18:09.951 "supported_io_types": { 00:18:09.951 "read": true, 00:18:09.951 "write": true, 00:18:09.951 "unmap": true, 00:18:09.951 "flush": true, 00:18:09.951 "reset": true, 00:18:09.951 "nvme_admin": false, 00:18:09.951 "nvme_io": false, 00:18:09.951 "nvme_io_md": false, 00:18:09.951 "write_zeroes": true, 00:18:09.951 "zcopy": true, 00:18:09.951 "get_zone_info": false, 00:18:09.951 "zone_management": false, 00:18:09.951 "zone_append": false, 00:18:09.951 "compare": false, 00:18:09.951 "compare_and_write": false, 00:18:09.951 "abort": true, 00:18:09.951 "seek_hole": false, 00:18:09.951 "seek_data": false, 00:18:09.951 "copy": true, 00:18:09.951 "nvme_iov_md": false 00:18:09.951 }, 00:18:09.951 "memory_domains": [ 00:18:09.951 { 00:18:09.951 "dma_device_id": "system", 00:18:09.951 "dma_device_type": 1 00:18:09.951 }, 00:18:09.951 { 00:18:09.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.951 "dma_device_type": 2 
00:18:09.951 } 00:18:09.951 ], 00:18:09.951 "driver_specific": {} 00:18:09.951 } 00:18:09.951 ] 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.951 10:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.211 10:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.211 "name": "Existed_Raid", 00:18:10.211 "uuid": "e07df5a4-fdcf-42a9-abd2-35b296cd53b5", 00:18:10.211 "strip_size_kb": 64, 00:18:10.211 "state": "configuring", 00:18:10.211 "raid_level": "raid0", 00:18:10.211 "superblock": true, 00:18:10.211 "num_base_bdevs": 4, 00:18:10.211 "num_base_bdevs_discovered": 3, 00:18:10.211 "num_base_bdevs_operational": 4, 00:18:10.211 "base_bdevs_list": [ 00:18:10.211 { 00:18:10.211 "name": "BaseBdev1", 00:18:10.211 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:10.211 "is_configured": true, 00:18:10.211 "data_offset": 2048, 00:18:10.211 "data_size": 63488 00:18:10.211 }, 00:18:10.211 { 00:18:10.211 "name": "BaseBdev2", 00:18:10.211 "uuid": "1fb90800-1db2-48ff-ab6f-45d5c97152ae", 00:18:10.211 "is_configured": true, 00:18:10.211 "data_offset": 2048, 00:18:10.211 "data_size": 63488 00:18:10.211 }, 00:18:10.211 { 00:18:10.211 "name": "BaseBdev3", 00:18:10.211 "uuid": "c2bebadb-e160-409d-95ba-96f3851dca03", 00:18:10.211 "is_configured": true, 00:18:10.211 "data_offset": 2048, 00:18:10.211 "data_size": 63488 00:18:10.211 }, 00:18:10.211 { 00:18:10.211 "name": "BaseBdev4", 00:18:10.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.211 "is_configured": false, 00:18:10.211 "data_offset": 0, 00:18:10.211 "data_size": 0 00:18:10.211 } 00:18:10.211 ] 00:18:10.211 }' 00:18:10.211 10:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.211 10:44:45 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:10.779 10:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:11.038 [2024-07-12 10:44:46.012558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:11.038 [2024-07-12 10:44:46.012726] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfdd350 00:18:11.038 [2024-07-12 10:44:46.012741] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:11.038 [2024-07-12 10:44:46.012909] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfdd020 00:18:11.038 [2024-07-12 10:44:46.013023] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfdd350 00:18:11.038 [2024-07-12 10:44:46.013039] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfdd350 00:18:11.038 [2024-07-12 10:44:46.013129] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:11.038 BaseBdev4 00:18:11.038 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:11.038 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:11.038 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:11.038 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:11.038 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:11.038 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:11.038 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:11.297 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:11.556 [ 00:18:11.556 { 00:18:11.556 "name": "BaseBdev4", 00:18:11.556 "aliases": [ 00:18:11.556 "4cef8db0-80aa-4417-b36b-bb4619daa656" 00:18:11.556 ], 00:18:11.556 "product_name": "Malloc disk", 00:18:11.556 "block_size": 512, 00:18:11.556 "num_blocks": 65536, 00:18:11.556 "uuid": "4cef8db0-80aa-4417-b36b-bb4619daa656", 00:18:11.556 "assigned_rate_limits": { 00:18:11.556 "rw_ios_per_sec": 0, 00:18:11.556 "rw_mbytes_per_sec": 0, 00:18:11.556 "r_mbytes_per_sec": 0, 00:18:11.556 "w_mbytes_per_sec": 0 00:18:11.556 }, 00:18:11.556 "claimed": true, 00:18:11.556 "claim_type": "exclusive_write", 00:18:11.556 "zoned": false, 00:18:11.556 "supported_io_types": { 00:18:11.556 "read": true, 00:18:11.556 "write": true, 00:18:11.556 "unmap": true, 00:18:11.556 "flush": true, 00:18:11.556 "reset": true, 00:18:11.556 "nvme_admin": false, 00:18:11.556 "nvme_io": false, 00:18:11.556 "nvme_io_md": false, 00:18:11.556 "write_zeroes": true, 00:18:11.556 "zcopy": true, 00:18:11.556 "get_zone_info": false, 00:18:11.556 "zone_management": false, 00:18:11.556 "zone_append": false, 00:18:11.556 "compare": false, 00:18:11.556 "compare_and_write": false, 00:18:11.556 "abort": true, 00:18:11.556 "seek_hole": false, 00:18:11.556 "seek_data": false, 00:18:11.556 "copy": 
true, 00:18:11.556 "nvme_iov_md": false 00:18:11.556 }, 00:18:11.556 "memory_domains": [ 00:18:11.556 { 00:18:11.556 "dma_device_id": "system", 00:18:11.556 "dma_device_type": 1 00:18:11.556 }, 00:18:11.556 { 00:18:11.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.556 "dma_device_type": 2 00:18:11.556 } 00:18:11.556 ], 00:18:11.556 "driver_specific": {} 00:18:11.556 } 00:18:11.556 ] 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.556 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.815 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.815 "name": "Existed_Raid", 00:18:11.815 "uuid": "e07df5a4-fdcf-42a9-abd2-35b296cd53b5", 00:18:11.815 "strip_size_kb": 64, 00:18:11.815 "state": "online", 00:18:11.815 "raid_level": "raid0", 00:18:11.815 "superblock": true, 00:18:11.815 "num_base_bdevs": 4, 00:18:11.815 "num_base_bdevs_discovered": 4, 00:18:11.815 "num_base_bdevs_operational": 4, 00:18:11.815 "base_bdevs_list": [ 00:18:11.815 { 00:18:11.815 "name": "BaseBdev1", 00:18:11.815 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:11.815 "is_configured": true, 00:18:11.815 "data_offset": 2048, 00:18:11.815 "data_size": 63488 00:18:11.815 }, 00:18:11.815 { 00:18:11.815 "name": "BaseBdev2", 00:18:11.815 "uuid": "1fb90800-1db2-48ff-ab6f-45d5c97152ae", 00:18:11.815 "is_configured": true, 00:18:11.815 "data_offset": 2048, 00:18:11.815 "data_size": 63488 00:18:11.815 }, 00:18:11.815 { 00:18:11.815 "name": "BaseBdev3", 00:18:11.815 "uuid": "c2bebadb-e160-409d-95ba-96f3851dca03", 00:18:11.815 "is_configured": true, 00:18:11.815 "data_offset": 2048, 00:18:11.815 "data_size": 63488 00:18:11.815 }, 00:18:11.815 { 00:18:11.815 "name": "BaseBdev4", 00:18:11.815 "uuid": "4cef8db0-80aa-4417-b36b-bb4619daa656", 00:18:11.815 
"is_configured": true, 00:18:11.815 "data_offset": 2048, 00:18:11.815 "data_size": 63488 00:18:11.815 } 00:18:11.815 ] 00:18:11.815 }' 00:18:11.815 10:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.815 10:44:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:12.383 [2024-07-12 10:44:47.484821] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:12.383 "name": "Existed_Raid", 00:18:12.383 "aliases": [ 00:18:12.383 "e07df5a4-fdcf-42a9-abd2-35b296cd53b5" 00:18:12.383 ], 00:18:12.383 "product_name": "Raid Volume", 00:18:12.383 "block_size": 512, 00:18:12.383 "num_blocks": 253952, 00:18:12.383 "uuid": "e07df5a4-fdcf-42a9-abd2-35b296cd53b5", 00:18:12.383 "assigned_rate_limits": { 00:18:12.383 "rw_ios_per_sec": 0, 00:18:12.383 "rw_mbytes_per_sec": 0, 00:18:12.383 "r_mbytes_per_sec": 0, 00:18:12.383 "w_mbytes_per_sec": 0 00:18:12.383 }, 00:18:12.383 "claimed": false, 00:18:12.383 "zoned": false, 00:18:12.383 "supported_io_types": { 00:18:12.383 "read": true, 00:18:12.383 "write": true, 00:18:12.383 "unmap": true, 00:18:12.383 "flush": true, 00:18:12.383 "reset": true, 00:18:12.383 "nvme_admin": false, 00:18:12.383 "nvme_io": false, 00:18:12.383 "nvme_io_md": false, 00:18:12.383 "write_zeroes": true, 00:18:12.383 "zcopy": false, 00:18:12.383 "get_zone_info": false, 00:18:12.383 "zone_management": false, 00:18:12.383 "zone_append": false, 00:18:12.383 "compare": false, 00:18:12.383 "compare_and_write": false, 00:18:12.383 "abort": false, 00:18:12.383 "seek_hole": false, 00:18:12.383 "seek_data": false, 00:18:12.383 "copy": false, 00:18:12.383 "nvme_iov_md": false 00:18:12.383 }, 00:18:12.383 "memory_domains": [ 00:18:12.383 { 00:18:12.383 "dma_device_id": "system", 00:18:12.383 "dma_device_type": 1 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.383 "dma_device_type": 2 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "dma_device_id": "system", 00:18:12.383 "dma_device_type": 1 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.383 "dma_device_type": 2 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "dma_device_id": "system", 00:18:12.383 "dma_device_type": 1 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.383 "dma_device_type": 2 00:18:12.383 }, 00:18:12.383 { 
00:18:12.383 "dma_device_id": "system", 00:18:12.383 "dma_device_type": 1 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.383 "dma_device_type": 2 00:18:12.383 } 00:18:12.383 ], 00:18:12.383 "driver_specific": { 00:18:12.383 "raid": { 00:18:12.383 "uuid": "e07df5a4-fdcf-42a9-abd2-35b296cd53b5", 00:18:12.383 "strip_size_kb": 64, 00:18:12.383 "state": "online", 00:18:12.383 "raid_level": "raid0", 00:18:12.383 "superblock": true, 00:18:12.383 "num_base_bdevs": 4, 00:18:12.383 "num_base_bdevs_discovered": 4, 00:18:12.383 "num_base_bdevs_operational": 4, 00:18:12.383 "base_bdevs_list": [ 00:18:12.383 { 00:18:12.383 "name": "BaseBdev1", 00:18:12.383 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:12.383 "is_configured": true, 00:18:12.383 "data_offset": 2048, 00:18:12.383 "data_size": 63488 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "name": "BaseBdev2", 00:18:12.383 "uuid": "1fb90800-1db2-48ff-ab6f-45d5c97152ae", 00:18:12.383 "is_configured": true, 00:18:12.383 "data_offset": 2048, 00:18:12.383 "data_size": 63488 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "name": "BaseBdev3", 00:18:12.383 "uuid": "c2bebadb-e160-409d-95ba-96f3851dca03", 00:18:12.383 "is_configured": true, 00:18:12.383 "data_offset": 2048, 00:18:12.383 "data_size": 63488 00:18:12.383 }, 00:18:12.383 { 00:18:12.383 "name": "BaseBdev4", 00:18:12.383 "uuid": "4cef8db0-80aa-4417-b36b-bb4619daa656", 00:18:12.383 "is_configured": true, 00:18:12.383 "data_offset": 2048, 00:18:12.383 "data_size": 63488 00:18:12.383 } 00:18:12.383 ] 00:18:12.383 } 00:18:12.383 } 00:18:12.383 }' 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:12.383 BaseBdev2 00:18:12.383 BaseBdev3 00:18:12.383 BaseBdev4' 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:12.383 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:12.642 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:12.642 "name": "BaseBdev1", 00:18:12.642 "aliases": [ 00:18:12.642 "25ddbed6-d51b-4cb1-9580-6f78784ac2cd" 00:18:12.642 ], 00:18:12.642 "product_name": "Malloc disk", 00:18:12.642 "block_size": 512, 00:18:12.642 "num_blocks": 65536, 00:18:12.642 "uuid": "25ddbed6-d51b-4cb1-9580-6f78784ac2cd", 00:18:12.642 "assigned_rate_limits": { 00:18:12.642 "rw_ios_per_sec": 0, 00:18:12.642 "rw_mbytes_per_sec": 0, 00:18:12.642 "r_mbytes_per_sec": 0, 00:18:12.642 "w_mbytes_per_sec": 0 00:18:12.642 }, 00:18:12.642 "claimed": true, 00:18:12.642 "claim_type": "exclusive_write", 00:18:12.642 "zoned": false, 00:18:12.642 "supported_io_types": { 00:18:12.642 "read": true, 00:18:12.642 "write": true, 00:18:12.642 "unmap": true, 00:18:12.642 "flush": true, 00:18:12.642 "reset": true, 00:18:12.642 "nvme_admin": false, 00:18:12.642 "nvme_io": false, 00:18:12.642 "nvme_io_md": false, 00:18:12.642 "write_zeroes": true, 00:18:12.642 "zcopy": true, 00:18:12.642 "get_zone_info": false, 00:18:12.642 "zone_management": false, 00:18:12.642 "zone_append": 
false, 00:18:12.642 "compare": false, 00:18:12.642 "compare_and_write": false, 00:18:12.642 "abort": true, 00:18:12.642 "seek_hole": false, 00:18:12.642 "seek_data": false, 00:18:12.642 "copy": true, 00:18:12.642 "nvme_iov_md": false 00:18:12.642 }, 00:18:12.642 "memory_domains": [ 00:18:12.642 { 00:18:12.642 "dma_device_id": "system", 00:18:12.642 "dma_device_type": 1 00:18:12.642 }, 00:18:12.642 { 00:18:12.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.642 "dma_device_type": 2 00:18:12.642 } 00:18:12.642 ], 00:18:12.642 "driver_specific": {} 00:18:12.642 }' 00:18:12.642 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:12.900 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:12.900 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:12.900 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:12.900 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:12.900 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:12.900 10:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:12.900 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:12.900 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:12.900 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:13.159 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:13.159 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:13.159 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:13.159 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:13.159 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:13.418 "name": "BaseBdev2", 00:18:13.418 "aliases": [ 00:18:13.418 "1fb90800-1db2-48ff-ab6f-45d5c97152ae" 00:18:13.418 ], 00:18:13.418 "product_name": "Malloc disk", 00:18:13.418 "block_size": 512, 00:18:13.418 "num_blocks": 65536, 00:18:13.418 "uuid": "1fb90800-1db2-48ff-ab6f-45d5c97152ae", 00:18:13.418 "assigned_rate_limits": { 00:18:13.418 "rw_ios_per_sec": 0, 00:18:13.418 "rw_mbytes_per_sec": 0, 00:18:13.418 "r_mbytes_per_sec": 0, 00:18:13.418 "w_mbytes_per_sec": 0 00:18:13.418 }, 00:18:13.418 "claimed": true, 00:18:13.418 "claim_type": "exclusive_write", 00:18:13.418 "zoned": false, 00:18:13.418 "supported_io_types": { 00:18:13.418 "read": true, 00:18:13.418 "write": true, 00:18:13.418 "unmap": true, 00:18:13.418 "flush": true, 00:18:13.418 "reset": true, 00:18:13.418 "nvme_admin": false, 00:18:13.418 "nvme_io": false, 00:18:13.418 "nvme_io_md": false, 00:18:13.418 "write_zeroes": true, 00:18:13.418 "zcopy": true, 00:18:13.418 "get_zone_info": false, 00:18:13.418 "zone_management": false, 00:18:13.418 "zone_append": false, 00:18:13.418 "compare": false, 00:18:13.418 "compare_and_write": false, 00:18:13.418 "abort": true, 00:18:13.418 "seek_hole": 
false, 00:18:13.418 "seek_data": false, 00:18:13.418 "copy": true, 00:18:13.418 "nvme_iov_md": false 00:18:13.418 }, 00:18:13.418 "memory_domains": [ 00:18:13.418 { 00:18:13.418 "dma_device_id": "system", 00:18:13.418 "dma_device_type": 1 00:18:13.418 }, 00:18:13.418 { 00:18:13.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.418 "dma_device_type": 2 00:18:13.418 } 00:18:13.418 ], 00:18:13.418 "driver_specific": {} 00:18:13.418 }' 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:13.418 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:13.677 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:13.677 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:13.677 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:13.677 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:13.677 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:13.677 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:13.677 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:13.936 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:13.936 "name": "BaseBdev3", 00:18:13.936 "aliases": [ 00:18:13.936 "c2bebadb-e160-409d-95ba-96f3851dca03" 00:18:13.936 ], 00:18:13.936 "product_name": "Malloc disk", 00:18:13.936 "block_size": 512, 00:18:13.936 "num_blocks": 65536, 00:18:13.936 "uuid": "c2bebadb-e160-409d-95ba-96f3851dca03", 00:18:13.936 "assigned_rate_limits": { 00:18:13.936 "rw_ios_per_sec": 0, 00:18:13.936 "rw_mbytes_per_sec": 0, 00:18:13.936 "r_mbytes_per_sec": 0, 00:18:13.936 "w_mbytes_per_sec": 0 00:18:13.936 }, 00:18:13.936 "claimed": true, 00:18:13.936 "claim_type": "exclusive_write", 00:18:13.936 "zoned": false, 00:18:13.936 "supported_io_types": { 00:18:13.936 "read": true, 00:18:13.936 "write": true, 00:18:13.936 "unmap": true, 00:18:13.936 "flush": true, 00:18:13.936 "reset": true, 00:18:13.936 "nvme_admin": false, 00:18:13.936 "nvme_io": false, 00:18:13.936 "nvme_io_md": false, 00:18:13.936 "write_zeroes": true, 00:18:13.936 "zcopy": true, 00:18:13.936 "get_zone_info": false, 00:18:13.936 "zone_management": false, 00:18:13.936 "zone_append": false, 00:18:13.936 "compare": false, 00:18:13.936 "compare_and_write": false, 00:18:13.936 "abort": true, 00:18:13.936 "seek_hole": false, 00:18:13.936 "seek_data": false, 00:18:13.936 "copy": true, 00:18:13.936 "nvme_iov_md": false 00:18:13.936 }, 00:18:13.936 
"memory_domains": [ 00:18:13.936 { 00:18:13.936 "dma_device_id": "system", 00:18:13.936 "dma_device_type": 1 00:18:13.936 }, 00:18:13.936 { 00:18:13.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.936 "dma_device_type": 2 00:18:13.936 } 00:18:13.936 ], 00:18:13.936 "driver_specific": {} 00:18:13.936 }' 00:18:13.936 10:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.936 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.936 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:13.936 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:13.936 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:14.195 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:14.454 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:14.454 "name": "BaseBdev4", 00:18:14.454 "aliases": [ 00:18:14.454 "4cef8db0-80aa-4417-b36b-bb4619daa656" 00:18:14.454 ], 00:18:14.454 "product_name": "Malloc disk", 00:18:14.454 "block_size": 512, 00:18:14.454 "num_blocks": 65536, 00:18:14.454 "uuid": "4cef8db0-80aa-4417-b36b-bb4619daa656", 00:18:14.454 "assigned_rate_limits": { 00:18:14.454 "rw_ios_per_sec": 0, 00:18:14.454 "rw_mbytes_per_sec": 0, 00:18:14.454 "r_mbytes_per_sec": 0, 00:18:14.454 "w_mbytes_per_sec": 0 00:18:14.454 }, 00:18:14.454 "claimed": true, 00:18:14.454 "claim_type": "exclusive_write", 00:18:14.454 "zoned": false, 00:18:14.454 "supported_io_types": { 00:18:14.454 "read": true, 00:18:14.454 "write": true, 00:18:14.454 "unmap": true, 00:18:14.454 "flush": true, 00:18:14.454 "reset": true, 00:18:14.454 "nvme_admin": false, 00:18:14.454 "nvme_io": false, 00:18:14.454 "nvme_io_md": false, 00:18:14.454 "write_zeroes": true, 00:18:14.454 "zcopy": true, 00:18:14.454 "get_zone_info": false, 00:18:14.454 "zone_management": false, 00:18:14.454 "zone_append": false, 00:18:14.454 "compare": false, 00:18:14.454 "compare_and_write": false, 00:18:14.454 "abort": true, 00:18:14.454 "seek_hole": false, 00:18:14.454 "seek_data": false, 00:18:14.454 "copy": true, 00:18:14.454 "nvme_iov_md": false 00:18:14.454 }, 00:18:14.454 "memory_domains": [ 00:18:14.454 { 00:18:14.454 "dma_device_id": "system", 00:18:14.454 "dma_device_type": 1 00:18:14.454 }, 
00:18:14.454 { 00:18:14.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.454 "dma_device_type": 2 00:18:14.454 } 00:18:14.454 ], 00:18:14.454 "driver_specific": {} 00:18:14.454 }' 00:18:14.454 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.454 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.454 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:14.454 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.454 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:14.713 10:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:14.972 [2024-07-12 10:44:50.079413] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:14.972 [2024-07-12 10:44:50.079445] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:14.972 [2024-07-12 10:44:50.079501] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.972 10:44:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.972 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:15.231 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.231 "name": "Existed_Raid", 00:18:15.231 "uuid": "e07df5a4-fdcf-42a9-abd2-35b296cd53b5", 00:18:15.231 "strip_size_kb": 64, 00:18:15.231 "state": "offline", 00:18:15.231 "raid_level": "raid0", 00:18:15.231 "superblock": true, 00:18:15.231 "num_base_bdevs": 4, 00:18:15.231 "num_base_bdevs_discovered": 3, 00:18:15.231 "num_base_bdevs_operational": 3, 00:18:15.231 "base_bdevs_list": [ 00:18:15.231 { 00:18:15.231 "name": null, 00:18:15.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.231 "is_configured": false, 00:18:15.231 "data_offset": 2048, 00:18:15.231 "data_size": 63488 00:18:15.231 }, 00:18:15.231 { 00:18:15.231 "name": "BaseBdev2", 00:18:15.231 "uuid": "1fb90800-1db2-48ff-ab6f-45d5c97152ae", 00:18:15.231 "is_configured": true, 00:18:15.231 "data_offset": 2048, 00:18:15.231 "data_size": 63488 00:18:15.231 }, 00:18:15.231 { 00:18:15.231 "name": "BaseBdev3", 00:18:15.231 "uuid": "c2bebadb-e160-409d-95ba-96f3851dca03", 00:18:15.231 "is_configured": true, 00:18:15.231 "data_offset": 2048, 00:18:15.231 "data_size": 63488 00:18:15.231 }, 00:18:15.231 { 00:18:15.231 "name": "BaseBdev4", 00:18:15.231 "uuid": "4cef8db0-80aa-4417-b36b-bb4619daa656", 00:18:15.231 "is_configured": true, 00:18:15.231 "data_offset": 2048, 00:18:15.231 "data_size": 63488 00:18:15.231 } 00:18:15.231 ] 00:18:15.231 }' 00:18:15.231 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.231 10:44:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:15.799 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:15.799 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:15.799 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.799 10:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:16.058 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:16.058 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:16.058 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:16.317 [2024-07-12 10:44:51.424289] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:16.317 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:16.317 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:16.317 10:44:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.317 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:16.576 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:16.576 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:16.576 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:16.834 [2024-07-12 10:44:51.924215] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:16.834 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:16.834 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:16.834 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.834 10:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:17.093 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:17.093 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:17.093 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:17.352 [2024-07-12 10:44:52.425811] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:17.352 [2024-07-12 10:44:52.425857] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfdd350 name Existed_Raid, state offline 00:18:17.352 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:17.352 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:17.352 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.352 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:17.610 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:17.610 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:17.610 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:17.610 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:17.610 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:17.610 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:17.869 BaseBdev2 00:18:17.869 10:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:17.869 10:44:52 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:17.869 10:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:17.869 10:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:17.869 10:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:17.869 10:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:17.869 10:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:18.127 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:18.384 [ 00:18:18.384 { 00:18:18.384 "name": "BaseBdev2", 00:18:18.384 "aliases": [ 00:18:18.384 "d599d874-2efa-4c8f-946d-dbc863bf0291" 00:18:18.384 ], 00:18:18.384 "product_name": "Malloc disk", 00:18:18.384 "block_size": 512, 00:18:18.384 "num_blocks": 65536, 00:18:18.384 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:18.384 "assigned_rate_limits": { 00:18:18.384 "rw_ios_per_sec": 0, 00:18:18.384 "rw_mbytes_per_sec": 0, 00:18:18.384 "r_mbytes_per_sec": 0, 00:18:18.384 "w_mbytes_per_sec": 0 00:18:18.384 }, 00:18:18.384 "claimed": false, 00:18:18.384 "zoned": false, 00:18:18.384 "supported_io_types": { 00:18:18.384 "read": true, 00:18:18.384 "write": true, 00:18:18.384 "unmap": true, 00:18:18.384 "flush": true, 00:18:18.384 "reset": true, 00:18:18.384 "nvme_admin": false, 00:18:18.384 "nvme_io": false, 00:18:18.384 "nvme_io_md": false, 00:18:18.384 "write_zeroes": true, 00:18:18.384 "zcopy": true, 00:18:18.384 "get_zone_info": false, 00:18:18.384 "zone_management": false, 00:18:18.384 "zone_append": false, 00:18:18.384 "compare": false, 00:18:18.384 "compare_and_write": false, 00:18:18.384 "abort": true, 00:18:18.384 "seek_hole": false, 00:18:18.384 "seek_data": false, 00:18:18.384 "copy": true, 00:18:18.384 "nvme_iov_md": false 00:18:18.384 }, 00:18:18.384 "memory_domains": [ 00:18:18.384 { 00:18:18.384 "dma_device_id": "system", 00:18:18.384 "dma_device_type": 1 00:18:18.384 }, 00:18:18.384 { 00:18:18.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.384 "dma_device_type": 2 00:18:18.384 } 00:18:18.384 ], 00:18:18.384 "driver_specific": {} 00:18:18.384 } 00:18:18.384 ] 00:18:18.384 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:18.384 10:44:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:18.384 10:44:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:18.384 10:44:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:18.643 BaseBdev3 00:18:18.643 10:44:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:18.643 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:18.643 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:18.643 10:44:53 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:18:18.643 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:18.643 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:18.643 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:18.901 10:44:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:18.901 [ 00:18:18.901 { 00:18:18.901 "name": "BaseBdev3", 00:18:18.901 "aliases": [ 00:18:18.901 "b711a99a-2e17-45f5-b57d-7ca1a02ccffb" 00:18:18.901 ], 00:18:18.901 "product_name": "Malloc disk", 00:18:18.901 "block_size": 512, 00:18:18.901 "num_blocks": 65536, 00:18:18.901 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:18.901 "assigned_rate_limits": { 00:18:18.901 "rw_ios_per_sec": 0, 00:18:18.901 "rw_mbytes_per_sec": 0, 00:18:18.901 "r_mbytes_per_sec": 0, 00:18:18.901 "w_mbytes_per_sec": 0 00:18:18.901 }, 00:18:18.901 "claimed": false, 00:18:18.901 "zoned": false, 00:18:18.901 "supported_io_types": { 00:18:18.901 "read": true, 00:18:18.901 "write": true, 00:18:18.901 "unmap": true, 00:18:18.901 "flush": true, 00:18:18.901 "reset": true, 00:18:18.901 "nvme_admin": false, 00:18:18.901 "nvme_io": false, 00:18:18.901 "nvme_io_md": false, 00:18:18.901 "write_zeroes": true, 00:18:18.901 "zcopy": true, 00:18:18.901 "get_zone_info": false, 00:18:18.901 "zone_management": false, 00:18:18.901 "zone_append": false, 00:18:18.901 "compare": false, 00:18:18.901 "compare_and_write": false, 00:18:18.901 "abort": true, 00:18:18.901 "seek_hole": false, 00:18:18.901 "seek_data": false, 00:18:18.901 "copy": true, 00:18:18.901 "nvme_iov_md": false 00:18:18.901 }, 00:18:18.901 "memory_domains": [ 00:18:18.901 { 00:18:18.901 "dma_device_id": "system", 00:18:18.901 "dma_device_type": 1 00:18:18.901 }, 00:18:18.901 { 00:18:18.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.901 "dma_device_type": 2 00:18:18.901 } 00:18:18.901 ], 00:18:18.901 "driver_specific": {} 00:18:18.901 } 00:18:18.901 ] 00:18:18.901 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:18.901 10:44:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:18.901 10:44:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:18.901 10:44:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:19.159 BaseBdev4 00:18:19.453 10:44:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:19.453 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:19.453 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:19.453 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:19.453 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:19.453 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:19.454 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:19.454 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:19.738 [ 00:18:19.738 { 00:18:19.738 "name": "BaseBdev4", 00:18:19.738 "aliases": [ 00:18:19.738 "65d2f885-9d55-4dff-bbca-5d65a5122ba4" 00:18:19.738 ], 00:18:19.738 "product_name": "Malloc disk", 00:18:19.738 "block_size": 512, 00:18:19.738 "num_blocks": 65536, 00:18:19.738 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:19.738 "assigned_rate_limits": { 00:18:19.738 "rw_ios_per_sec": 0, 00:18:19.738 "rw_mbytes_per_sec": 0, 00:18:19.738 "r_mbytes_per_sec": 0, 00:18:19.738 "w_mbytes_per_sec": 0 00:18:19.738 }, 00:18:19.738 "claimed": false, 00:18:19.738 "zoned": false, 00:18:19.738 "supported_io_types": { 00:18:19.738 "read": true, 00:18:19.738 "write": true, 00:18:19.738 "unmap": true, 00:18:19.738 "flush": true, 00:18:19.738 "reset": true, 00:18:19.738 "nvme_admin": false, 00:18:19.738 "nvme_io": false, 00:18:19.738 "nvme_io_md": false, 00:18:19.738 "write_zeroes": true, 00:18:19.738 "zcopy": true, 00:18:19.738 "get_zone_info": false, 00:18:19.738 "zone_management": false, 00:18:19.738 "zone_append": false, 00:18:19.738 "compare": false, 00:18:19.738 "compare_and_write": false, 00:18:19.738 "abort": true, 00:18:19.738 "seek_hole": false, 00:18:19.738 "seek_data": false, 00:18:19.738 "copy": true, 00:18:19.738 "nvme_iov_md": false 00:18:19.738 }, 00:18:19.738 "memory_domains": [ 00:18:19.738 { 00:18:19.738 "dma_device_id": "system", 00:18:19.738 "dma_device_type": 1 00:18:19.738 }, 00:18:19.738 { 00:18:19.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.738 "dma_device_type": 2 00:18:19.738 } 00:18:19.738 ], 00:18:19.738 "driver_specific": {} 00:18:19.738 } 00:18:19.738 ] 00:18:19.738 10:44:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:19.738 10:44:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:19.738 10:44:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:19.738 10:44:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:19.997 [2024-07-12 10:44:55.071848] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:19.997 [2024-07-12 10:44:55.071891] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:19.997 [2024-07-12 10:44:55.071910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:19.997 [2024-07-12 10:44:55.073217] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:19.997 [2024-07-12 10:44:55.073258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.997 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:20.256 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.256 "name": "Existed_Raid", 00:18:20.256 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:20.256 "strip_size_kb": 64, 00:18:20.256 "state": "configuring", 00:18:20.256 "raid_level": "raid0", 00:18:20.256 "superblock": true, 00:18:20.256 "num_base_bdevs": 4, 00:18:20.256 "num_base_bdevs_discovered": 3, 00:18:20.256 "num_base_bdevs_operational": 4, 00:18:20.256 "base_bdevs_list": [ 00:18:20.256 { 00:18:20.256 "name": "BaseBdev1", 00:18:20.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.256 "is_configured": false, 00:18:20.256 "data_offset": 0, 00:18:20.256 "data_size": 0 00:18:20.256 }, 00:18:20.256 { 00:18:20.256 "name": "BaseBdev2", 00:18:20.256 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:20.256 "is_configured": true, 00:18:20.256 "data_offset": 2048, 00:18:20.256 "data_size": 63488 00:18:20.256 }, 00:18:20.256 { 00:18:20.256 "name": "BaseBdev3", 00:18:20.256 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:20.256 "is_configured": true, 00:18:20.256 "data_offset": 2048, 00:18:20.256 "data_size": 63488 00:18:20.256 }, 00:18:20.256 { 00:18:20.256 "name": "BaseBdev4", 00:18:20.256 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:20.256 "is_configured": true, 00:18:20.256 "data_offset": 2048, 00:18:20.256 "data_size": 63488 00:18:20.256 } 00:18:20.256 ] 00:18:20.256 }' 00:18:20.256 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.256 10:44:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.824 10:44:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:21.082 [2024-07-12 10:44:56.114580] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:21.083 10:44:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:21.083 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.341 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:21.341 "name": "Existed_Raid", 00:18:21.341 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:21.341 "strip_size_kb": 64, 00:18:21.341 "state": "configuring", 00:18:21.341 "raid_level": "raid0", 00:18:21.341 "superblock": true, 00:18:21.341 "num_base_bdevs": 4, 00:18:21.341 "num_base_bdevs_discovered": 2, 00:18:21.341 "num_base_bdevs_operational": 4, 00:18:21.341 "base_bdevs_list": [ 00:18:21.341 { 00:18:21.341 "name": "BaseBdev1", 00:18:21.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.341 "is_configured": false, 00:18:21.341 "data_offset": 0, 00:18:21.341 "data_size": 0 00:18:21.341 }, 00:18:21.341 { 00:18:21.341 "name": null, 00:18:21.341 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:21.341 "is_configured": false, 00:18:21.341 "data_offset": 2048, 00:18:21.341 "data_size": 63488 00:18:21.341 }, 00:18:21.341 { 00:18:21.341 "name": "BaseBdev3", 00:18:21.341 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:21.341 "is_configured": true, 00:18:21.341 "data_offset": 2048, 00:18:21.341 "data_size": 63488 00:18:21.341 }, 00:18:21.341 { 00:18:21.341 "name": "BaseBdev4", 00:18:21.341 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:21.341 "is_configured": true, 00:18:21.341 "data_offset": 2048, 00:18:21.341 "data_size": 63488 00:18:21.341 } 00:18:21.341 ] 00:18:21.341 }' 00:18:21.341 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:21.341 10:44:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:21.907 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.907 10:44:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:22.166 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:22.166 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
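The whole sequence traced above is driven through SPDK's JSON-RPC client, scripts/rpc.py, pointed at the dedicated test socket /var/tmp/spdk-raid.sock: malloc base bdevs are created with bdev_malloc_create, the array is assembled with bdev_raid_create, state transitions are read back with bdev_raid_get_bdevs piped through jq, and members are torn out again with bdev_malloc_delete / bdev_raid_remove_base_bdev. The commands below condense that flow into a hand-runnable form. This is a minimal sketch only, assuming an SPDK target is already listening on /var/tmp/spdk-raid.sock and that no bdevs with these names exist yet; the $RPC shorthand is introduced here for brevity and is not part of the test script.

    # shorthand for the RPC client used throughout this trace (introduced here, not in the test)
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # create four 32 MB malloc bdevs with 512-byte blocks (65536 blocks each, as in the dumps above)
    for i in 1 2 3 4; do
        $RPC bdev_malloc_create 32 512 -b BaseBdev$i
    done
    $RPC bdev_wait_for_examine

    # assemble them into a raid0 array with a 64 KB strip and an on-disk superblock (-s)
    $RPC bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

    # the array should now report state "online" with all four base bdevs discovered
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

    # raid0 has no redundancy, so deleting any member drops the array to "offline"
    $RPC bdev_malloc_delete BaseBdev1
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'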
00:18:22.425 [2024-07-12 10:44:57.394517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:22.425 BaseBdev1 00:18:22.425 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:22.425 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:22.425 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:22.425 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:22.425 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:22.425 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:22.425 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:22.684 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:22.684 [ 00:18:22.684 { 00:18:22.684 "name": "BaseBdev1", 00:18:22.684 "aliases": [ 00:18:22.684 "1b0d986c-3c69-4a40-8460-88cfd05df3b9" 00:18:22.684 ], 00:18:22.684 "product_name": "Malloc disk", 00:18:22.684 "block_size": 512, 00:18:22.684 "num_blocks": 65536, 00:18:22.684 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:22.684 "assigned_rate_limits": { 00:18:22.684 "rw_ios_per_sec": 0, 00:18:22.684 "rw_mbytes_per_sec": 0, 00:18:22.684 "r_mbytes_per_sec": 0, 00:18:22.684 "w_mbytes_per_sec": 0 00:18:22.684 }, 00:18:22.684 "claimed": true, 00:18:22.684 "claim_type": "exclusive_write", 00:18:22.684 "zoned": false, 00:18:22.684 "supported_io_types": { 00:18:22.684 "read": true, 00:18:22.684 "write": true, 00:18:22.684 "unmap": true, 00:18:22.684 "flush": true, 00:18:22.684 "reset": true, 00:18:22.684 "nvme_admin": false, 00:18:22.684 "nvme_io": false, 00:18:22.684 "nvme_io_md": false, 00:18:22.684 "write_zeroes": true, 00:18:22.684 "zcopy": true, 00:18:22.684 "get_zone_info": false, 00:18:22.684 "zone_management": false, 00:18:22.684 "zone_append": false, 00:18:22.684 "compare": false, 00:18:22.684 "compare_and_write": false, 00:18:22.684 "abort": true, 00:18:22.684 "seek_hole": false, 00:18:22.684 "seek_data": false, 00:18:22.684 "copy": true, 00:18:22.684 "nvme_iov_md": false 00:18:22.684 }, 00:18:22.684 "memory_domains": [ 00:18:22.684 { 00:18:22.684 "dma_device_id": "system", 00:18:22.684 "dma_device_type": 1 00:18:22.684 }, 00:18:22.684 { 00:18:22.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.684 "dma_device_type": 2 00:18:22.684 } 00:18:22.684 ], 00:18:22.684 "driver_specific": {} 00:18:22.684 } 00:18:22.684 ] 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.942 10:44:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:22.942 10:44:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.942 "name": "Existed_Raid", 00:18:22.942 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:22.942 "strip_size_kb": 64, 00:18:22.942 "state": "configuring", 00:18:22.942 "raid_level": "raid0", 00:18:22.942 "superblock": true, 00:18:22.942 "num_base_bdevs": 4, 00:18:22.942 "num_base_bdevs_discovered": 3, 00:18:22.942 "num_base_bdevs_operational": 4, 00:18:22.942 "base_bdevs_list": [ 00:18:22.942 { 00:18:22.942 "name": "BaseBdev1", 00:18:22.942 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:22.942 "is_configured": true, 00:18:22.942 "data_offset": 2048, 00:18:22.942 "data_size": 63488 00:18:22.942 }, 00:18:22.942 { 00:18:22.942 "name": null, 00:18:22.942 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:22.942 "is_configured": false, 00:18:22.942 "data_offset": 2048, 00:18:22.942 "data_size": 63488 00:18:22.942 }, 00:18:22.942 { 00:18:22.942 "name": "BaseBdev3", 00:18:22.942 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:22.942 "is_configured": true, 00:18:22.942 "data_offset": 2048, 00:18:22.942 "data_size": 63488 00:18:22.942 }, 00:18:22.942 { 00:18:22.942 "name": "BaseBdev4", 00:18:22.942 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:22.942 "is_configured": true, 00:18:22.942 "data_offset": 2048, 00:18:22.942 "data_size": 63488 00:18:22.942 } 00:18:22.942 ] 00:18:22.942 }' 00:18:22.942 10:44:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.942 10:44:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:23.875 10:44:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.875 10:44:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:23.875 10:44:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:23.875 10:44:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:24.134 [2024-07-12 10:44:59.115082] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.134 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:24.393 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.393 "name": "Existed_Raid", 00:18:24.393 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:24.393 "strip_size_kb": 64, 00:18:24.393 "state": "configuring", 00:18:24.393 "raid_level": "raid0", 00:18:24.393 "superblock": true, 00:18:24.393 "num_base_bdevs": 4, 00:18:24.393 "num_base_bdevs_discovered": 2, 00:18:24.393 "num_base_bdevs_operational": 4, 00:18:24.393 "base_bdevs_list": [ 00:18:24.393 { 00:18:24.393 "name": "BaseBdev1", 00:18:24.393 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:24.393 "is_configured": true, 00:18:24.393 "data_offset": 2048, 00:18:24.393 "data_size": 63488 00:18:24.393 }, 00:18:24.393 { 00:18:24.393 "name": null, 00:18:24.393 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:24.393 "is_configured": false, 00:18:24.393 "data_offset": 2048, 00:18:24.393 "data_size": 63488 00:18:24.393 }, 00:18:24.393 { 00:18:24.393 "name": null, 00:18:24.393 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:24.393 "is_configured": false, 00:18:24.393 "data_offset": 2048, 00:18:24.393 "data_size": 63488 00:18:24.393 }, 00:18:24.393 { 00:18:24.393 "name": "BaseBdev4", 00:18:24.393 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:24.393 "is_configured": true, 00:18:24.393 "data_offset": 2048, 00:18:24.393 "data_size": 63488 00:18:24.393 } 00:18:24.393 ] 00:18:24.393 }' 00:18:24.393 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.393 10:44:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:24.959 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.959 10:44:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:25.217 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:25.217 
10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:25.475 [2024-07-12 10:45:00.458674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.475 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.733 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.733 "name": "Existed_Raid", 00:18:25.733 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:25.733 "strip_size_kb": 64, 00:18:25.733 "state": "configuring", 00:18:25.733 "raid_level": "raid0", 00:18:25.733 "superblock": true, 00:18:25.733 "num_base_bdevs": 4, 00:18:25.733 "num_base_bdevs_discovered": 3, 00:18:25.733 "num_base_bdevs_operational": 4, 00:18:25.733 "base_bdevs_list": [ 00:18:25.733 { 00:18:25.733 "name": "BaseBdev1", 00:18:25.733 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:25.733 "is_configured": true, 00:18:25.733 "data_offset": 2048, 00:18:25.733 "data_size": 63488 00:18:25.733 }, 00:18:25.733 { 00:18:25.733 "name": null, 00:18:25.733 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:25.733 "is_configured": false, 00:18:25.733 "data_offset": 2048, 00:18:25.733 "data_size": 63488 00:18:25.733 }, 00:18:25.733 { 00:18:25.733 "name": "BaseBdev3", 00:18:25.733 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:25.733 "is_configured": true, 00:18:25.733 "data_offset": 2048, 00:18:25.733 "data_size": 63488 00:18:25.733 }, 00:18:25.733 { 00:18:25.733 "name": "BaseBdev4", 00:18:25.733 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:25.733 "is_configured": true, 00:18:25.733 "data_offset": 2048, 00:18:25.733 "data_size": 63488 00:18:25.733 } 00:18:25.733 ] 00:18:25.733 }' 00:18:25.733 10:45:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.733 10:45:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.299 10:45:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.299 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:26.299 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:26.299 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:26.557 [2024-07-12 10:45:01.681923] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.557 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.816 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.816 "name": "Existed_Raid", 00:18:26.816 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:26.816 "strip_size_kb": 64, 00:18:26.816 "state": "configuring", 00:18:26.816 "raid_level": "raid0", 00:18:26.816 "superblock": true, 00:18:26.816 "num_base_bdevs": 4, 00:18:26.816 "num_base_bdevs_discovered": 2, 00:18:26.816 "num_base_bdevs_operational": 4, 00:18:26.816 "base_bdevs_list": [ 00:18:26.816 { 00:18:26.816 "name": null, 00:18:26.816 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:26.816 "is_configured": false, 00:18:26.816 "data_offset": 2048, 00:18:26.816 "data_size": 63488 00:18:26.816 }, 00:18:26.816 { 00:18:26.816 "name": null, 00:18:26.816 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:26.816 "is_configured": false, 00:18:26.816 "data_offset": 2048, 00:18:26.816 "data_size": 63488 00:18:26.816 }, 00:18:26.816 { 00:18:26.816 "name": "BaseBdev3", 00:18:26.816 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:26.816 "is_configured": true, 00:18:26.816 "data_offset": 2048, 00:18:26.816 "data_size": 63488 00:18:26.816 }, 00:18:26.816 { 00:18:26.816 "name": "BaseBdev4", 00:18:26.816 "uuid": 
"65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:26.816 "is_configured": true, 00:18:26.816 "data_offset": 2048, 00:18:26.816 "data_size": 63488 00:18:26.816 } 00:18:26.816 ] 00:18:26.816 }' 00:18:26.816 10:45:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.816 10:45:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.750 10:45:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.750 10:45:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:27.750 10:45:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:27.750 10:45:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:28.008 [2024-07-12 10:45:03.164409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.008 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.266 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.266 "name": "Existed_Raid", 00:18:28.266 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:28.266 "strip_size_kb": 64, 00:18:28.266 "state": "configuring", 00:18:28.266 "raid_level": "raid0", 00:18:28.266 "superblock": true, 00:18:28.266 "num_base_bdevs": 4, 00:18:28.266 "num_base_bdevs_discovered": 3, 00:18:28.266 "num_base_bdevs_operational": 4, 00:18:28.266 "base_bdevs_list": [ 00:18:28.266 { 00:18:28.266 "name": null, 00:18:28.266 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:28.266 "is_configured": false, 00:18:28.266 "data_offset": 2048, 00:18:28.266 "data_size": 63488 00:18:28.266 }, 00:18:28.266 { 00:18:28.266 "name": "BaseBdev2", 00:18:28.266 "uuid": 
"d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:28.266 "is_configured": true, 00:18:28.266 "data_offset": 2048, 00:18:28.266 "data_size": 63488 00:18:28.266 }, 00:18:28.266 { 00:18:28.266 "name": "BaseBdev3", 00:18:28.266 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:28.266 "is_configured": true, 00:18:28.266 "data_offset": 2048, 00:18:28.266 "data_size": 63488 00:18:28.266 }, 00:18:28.266 { 00:18:28.266 "name": "BaseBdev4", 00:18:28.266 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:28.266 "is_configured": true, 00:18:28.266 "data_offset": 2048, 00:18:28.266 "data_size": 63488 00:18:28.266 } 00:18:28.266 ] 00:18:28.266 }' 00:18:28.266 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.266 10:45:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:28.832 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.832 10:45:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:29.091 10:45:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:29.091 10:45:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.091 10:45:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:29.350 10:45:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1b0d986c-3c69-4a40-8460-88cfd05df3b9 00:18:29.918 [2024-07-12 10:45:04.916546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:29.918 [2024-07-12 10:45:04.916703] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfe3470 00:18:29.918 [2024-07-12 10:45:04.916716] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:29.918 [2024-07-12 10:45:04.916889] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd3c40 00:18:29.918 [2024-07-12 10:45:04.917003] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfe3470 00:18:29.918 [2024-07-12 10:45:04.917013] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfe3470 00:18:29.919 [2024-07-12 10:45:04.917102] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.919 NewBaseBdev 00:18:29.919 10:45:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:29.919 10:45:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:29.919 10:45:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:29.919 10:45:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:29.919 10:45:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:29.919 10:45:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:29.919 10:45:04 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:30.178 10:45:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:30.178 [ 00:18:30.178 { 00:18:30.178 "name": "NewBaseBdev", 00:18:30.178 "aliases": [ 00:18:30.178 "1b0d986c-3c69-4a40-8460-88cfd05df3b9" 00:18:30.178 ], 00:18:30.178 "product_name": "Malloc disk", 00:18:30.178 "block_size": 512, 00:18:30.178 "num_blocks": 65536, 00:18:30.178 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:30.178 "assigned_rate_limits": { 00:18:30.178 "rw_ios_per_sec": 0, 00:18:30.178 "rw_mbytes_per_sec": 0, 00:18:30.178 "r_mbytes_per_sec": 0, 00:18:30.178 "w_mbytes_per_sec": 0 00:18:30.178 }, 00:18:30.178 "claimed": true, 00:18:30.178 "claim_type": "exclusive_write", 00:18:30.178 "zoned": false, 00:18:30.178 "supported_io_types": { 00:18:30.178 "read": true, 00:18:30.178 "write": true, 00:18:30.178 "unmap": true, 00:18:30.178 "flush": true, 00:18:30.178 "reset": true, 00:18:30.178 "nvme_admin": false, 00:18:30.178 "nvme_io": false, 00:18:30.178 "nvme_io_md": false, 00:18:30.178 "write_zeroes": true, 00:18:30.178 "zcopy": true, 00:18:30.178 "get_zone_info": false, 00:18:30.178 "zone_management": false, 00:18:30.178 "zone_append": false, 00:18:30.178 "compare": false, 00:18:30.178 "compare_and_write": false, 00:18:30.179 "abort": true, 00:18:30.179 "seek_hole": false, 00:18:30.179 "seek_data": false, 00:18:30.179 "copy": true, 00:18:30.179 "nvme_iov_md": false 00:18:30.179 }, 00:18:30.179 "memory_domains": [ 00:18:30.179 { 00:18:30.179 "dma_device_id": "system", 00:18:30.179 "dma_device_type": 1 00:18:30.179 }, 00:18:30.179 { 00:18:30.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.179 "dma_device_type": 2 00:18:30.179 } 00:18:30.179 ], 00:18:30.179 "driver_specific": {} 00:18:30.179 } 00:18:30.179 ] 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.179 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.179 10:45:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.438 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.438 "name": "Existed_Raid", 00:18:30.438 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:30.438 "strip_size_kb": 64, 00:18:30.438 "state": "online", 00:18:30.438 "raid_level": "raid0", 00:18:30.438 "superblock": true, 00:18:30.439 "num_base_bdevs": 4, 00:18:30.439 "num_base_bdevs_discovered": 4, 00:18:30.439 "num_base_bdevs_operational": 4, 00:18:30.439 "base_bdevs_list": [ 00:18:30.439 { 00:18:30.439 "name": "NewBaseBdev", 00:18:30.439 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:30.439 "is_configured": true, 00:18:30.439 "data_offset": 2048, 00:18:30.439 "data_size": 63488 00:18:30.439 }, 00:18:30.439 { 00:18:30.439 "name": "BaseBdev2", 00:18:30.439 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:30.439 "is_configured": true, 00:18:30.439 "data_offset": 2048, 00:18:30.439 "data_size": 63488 00:18:30.439 }, 00:18:30.439 { 00:18:30.439 "name": "BaseBdev3", 00:18:30.439 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:30.439 "is_configured": true, 00:18:30.439 "data_offset": 2048, 00:18:30.439 "data_size": 63488 00:18:30.439 }, 00:18:30.439 { 00:18:30.439 "name": "BaseBdev4", 00:18:30.439 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:30.439 "is_configured": true, 00:18:30.439 "data_offset": 2048, 00:18:30.439 "data_size": 63488 00:18:30.439 } 00:18:30.439 ] 00:18:30.439 }' 00:18:30.439 10:45:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.439 10:45:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:31.006 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:31.264 [2024-07-12 10:45:06.336657] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:31.264 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:31.264 "name": "Existed_Raid", 00:18:31.264 "aliases": [ 00:18:31.264 "034e1345-0e03-439f-992a-40c59db6e13e" 00:18:31.264 ], 00:18:31.264 "product_name": "Raid Volume", 00:18:31.264 "block_size": 512, 00:18:31.264 "num_blocks": 253952, 00:18:31.264 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:31.264 "assigned_rate_limits": { 00:18:31.264 "rw_ios_per_sec": 0, 00:18:31.264 "rw_mbytes_per_sec": 0, 00:18:31.264 "r_mbytes_per_sec": 0, 00:18:31.264 "w_mbytes_per_sec": 0 00:18:31.264 }, 00:18:31.264 "claimed": false, 00:18:31.264 "zoned": 
false, 00:18:31.264 "supported_io_types": { 00:18:31.264 "read": true, 00:18:31.264 "write": true, 00:18:31.264 "unmap": true, 00:18:31.264 "flush": true, 00:18:31.264 "reset": true, 00:18:31.264 "nvme_admin": false, 00:18:31.264 "nvme_io": false, 00:18:31.264 "nvme_io_md": false, 00:18:31.264 "write_zeroes": true, 00:18:31.264 "zcopy": false, 00:18:31.264 "get_zone_info": false, 00:18:31.264 "zone_management": false, 00:18:31.264 "zone_append": false, 00:18:31.264 "compare": false, 00:18:31.264 "compare_and_write": false, 00:18:31.264 "abort": false, 00:18:31.264 "seek_hole": false, 00:18:31.264 "seek_data": false, 00:18:31.264 "copy": false, 00:18:31.264 "nvme_iov_md": false 00:18:31.264 }, 00:18:31.264 "memory_domains": [ 00:18:31.264 { 00:18:31.264 "dma_device_id": "system", 00:18:31.264 "dma_device_type": 1 00:18:31.264 }, 00:18:31.264 { 00:18:31.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.264 "dma_device_type": 2 00:18:31.264 }, 00:18:31.264 { 00:18:31.264 "dma_device_id": "system", 00:18:31.264 "dma_device_type": 1 00:18:31.264 }, 00:18:31.264 { 00:18:31.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.264 "dma_device_type": 2 00:18:31.264 }, 00:18:31.264 { 00:18:31.264 "dma_device_id": "system", 00:18:31.264 "dma_device_type": 1 00:18:31.264 }, 00:18:31.264 { 00:18:31.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.264 "dma_device_type": 2 00:18:31.264 }, 00:18:31.264 { 00:18:31.264 "dma_device_id": "system", 00:18:31.264 "dma_device_type": 1 00:18:31.264 }, 00:18:31.264 { 00:18:31.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.265 "dma_device_type": 2 00:18:31.265 } 00:18:31.265 ], 00:18:31.265 "driver_specific": { 00:18:31.265 "raid": { 00:18:31.265 "uuid": "034e1345-0e03-439f-992a-40c59db6e13e", 00:18:31.265 "strip_size_kb": 64, 00:18:31.265 "state": "online", 00:18:31.265 "raid_level": "raid0", 00:18:31.265 "superblock": true, 00:18:31.265 "num_base_bdevs": 4, 00:18:31.265 "num_base_bdevs_discovered": 4, 00:18:31.265 "num_base_bdevs_operational": 4, 00:18:31.265 "base_bdevs_list": [ 00:18:31.265 { 00:18:31.265 "name": "NewBaseBdev", 00:18:31.265 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:31.265 "is_configured": true, 00:18:31.265 "data_offset": 2048, 00:18:31.265 "data_size": 63488 00:18:31.265 }, 00:18:31.265 { 00:18:31.265 "name": "BaseBdev2", 00:18:31.265 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:31.265 "is_configured": true, 00:18:31.265 "data_offset": 2048, 00:18:31.265 "data_size": 63488 00:18:31.265 }, 00:18:31.265 { 00:18:31.265 "name": "BaseBdev3", 00:18:31.265 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:31.265 "is_configured": true, 00:18:31.265 "data_offset": 2048, 00:18:31.265 "data_size": 63488 00:18:31.265 }, 00:18:31.265 { 00:18:31.265 "name": "BaseBdev4", 00:18:31.265 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:31.265 "is_configured": true, 00:18:31.265 "data_offset": 2048, 00:18:31.265 "data_size": 63488 00:18:31.265 } 00:18:31.265 ] 00:18:31.265 } 00:18:31.265 } 00:18:31.265 }' 00:18:31.265 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:31.265 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:31.265 BaseBdev2 00:18:31.265 BaseBdev3 00:18:31.265 BaseBdev4' 00:18:31.265 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:31.265 10:45:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:31.265 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.523 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:31.523 "name": "NewBaseBdev", 00:18:31.523 "aliases": [ 00:18:31.523 "1b0d986c-3c69-4a40-8460-88cfd05df3b9" 00:18:31.523 ], 00:18:31.523 "product_name": "Malloc disk", 00:18:31.523 "block_size": 512, 00:18:31.523 "num_blocks": 65536, 00:18:31.523 "uuid": "1b0d986c-3c69-4a40-8460-88cfd05df3b9", 00:18:31.523 "assigned_rate_limits": { 00:18:31.523 "rw_ios_per_sec": 0, 00:18:31.523 "rw_mbytes_per_sec": 0, 00:18:31.523 "r_mbytes_per_sec": 0, 00:18:31.523 "w_mbytes_per_sec": 0 00:18:31.523 }, 00:18:31.523 "claimed": true, 00:18:31.523 "claim_type": "exclusive_write", 00:18:31.523 "zoned": false, 00:18:31.523 "supported_io_types": { 00:18:31.523 "read": true, 00:18:31.523 "write": true, 00:18:31.523 "unmap": true, 00:18:31.523 "flush": true, 00:18:31.523 "reset": true, 00:18:31.523 "nvme_admin": false, 00:18:31.523 "nvme_io": false, 00:18:31.523 "nvme_io_md": false, 00:18:31.523 "write_zeroes": true, 00:18:31.523 "zcopy": true, 00:18:31.523 "get_zone_info": false, 00:18:31.523 "zone_management": false, 00:18:31.523 "zone_append": false, 00:18:31.523 "compare": false, 00:18:31.523 "compare_and_write": false, 00:18:31.523 "abort": true, 00:18:31.524 "seek_hole": false, 00:18:31.524 "seek_data": false, 00:18:31.524 "copy": true, 00:18:31.524 "nvme_iov_md": false 00:18:31.524 }, 00:18:31.524 "memory_domains": [ 00:18:31.524 { 00:18:31.524 "dma_device_id": "system", 00:18:31.524 "dma_device_type": 1 00:18:31.524 }, 00:18:31.524 { 00:18:31.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.524 "dma_device_type": 2 00:18:31.524 } 00:18:31.524 ], 00:18:31.524 "driver_specific": {} 00:18:31.524 }' 00:18:31.524 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.524 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.782 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.041 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.041 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:32.041 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:32.041 10:45:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:32.300 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.300 "name": "BaseBdev2", 00:18:32.300 "aliases": [ 00:18:32.300 "d599d874-2efa-4c8f-946d-dbc863bf0291" 00:18:32.300 ], 00:18:32.300 "product_name": "Malloc disk", 00:18:32.300 "block_size": 512, 00:18:32.300 "num_blocks": 65536, 00:18:32.300 "uuid": "d599d874-2efa-4c8f-946d-dbc863bf0291", 00:18:32.300 "assigned_rate_limits": { 00:18:32.300 "rw_ios_per_sec": 0, 00:18:32.300 "rw_mbytes_per_sec": 0, 00:18:32.300 "r_mbytes_per_sec": 0, 00:18:32.300 "w_mbytes_per_sec": 0 00:18:32.300 }, 00:18:32.300 "claimed": true, 00:18:32.300 "claim_type": "exclusive_write", 00:18:32.300 "zoned": false, 00:18:32.300 "supported_io_types": { 00:18:32.300 "read": true, 00:18:32.300 "write": true, 00:18:32.300 "unmap": true, 00:18:32.300 "flush": true, 00:18:32.300 "reset": true, 00:18:32.300 "nvme_admin": false, 00:18:32.300 "nvme_io": false, 00:18:32.301 "nvme_io_md": false, 00:18:32.301 "write_zeroes": true, 00:18:32.301 "zcopy": true, 00:18:32.301 "get_zone_info": false, 00:18:32.301 "zone_management": false, 00:18:32.301 "zone_append": false, 00:18:32.301 "compare": false, 00:18:32.301 "compare_and_write": false, 00:18:32.301 "abort": true, 00:18:32.301 "seek_hole": false, 00:18:32.301 "seek_data": false, 00:18:32.301 "copy": true, 00:18:32.301 "nvme_iov_md": false 00:18:32.301 }, 00:18:32.301 "memory_domains": [ 00:18:32.301 { 00:18:32.301 "dma_device_id": "system", 00:18:32.301 "dma_device_type": 1 00:18:32.301 }, 00:18:32.301 { 00:18:32.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.301 "dma_device_type": 2 00:18:32.301 } 00:18:32.301 ], 00:18:32.301 "driver_specific": {} 00:18:32.301 }' 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.301 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.560 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.560 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.560 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.560 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.560 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:32.560 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:32.560 
10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:32.819 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.819 "name": "BaseBdev3", 00:18:32.819 "aliases": [ 00:18:32.819 "b711a99a-2e17-45f5-b57d-7ca1a02ccffb" 00:18:32.819 ], 00:18:32.819 "product_name": "Malloc disk", 00:18:32.819 "block_size": 512, 00:18:32.819 "num_blocks": 65536, 00:18:32.819 "uuid": "b711a99a-2e17-45f5-b57d-7ca1a02ccffb", 00:18:32.819 "assigned_rate_limits": { 00:18:32.819 "rw_ios_per_sec": 0, 00:18:32.819 "rw_mbytes_per_sec": 0, 00:18:32.819 "r_mbytes_per_sec": 0, 00:18:32.819 "w_mbytes_per_sec": 0 00:18:32.819 }, 00:18:32.819 "claimed": true, 00:18:32.819 "claim_type": "exclusive_write", 00:18:32.819 "zoned": false, 00:18:32.819 "supported_io_types": { 00:18:32.819 "read": true, 00:18:32.819 "write": true, 00:18:32.819 "unmap": true, 00:18:32.819 "flush": true, 00:18:32.819 "reset": true, 00:18:32.819 "nvme_admin": false, 00:18:32.819 "nvme_io": false, 00:18:32.819 "nvme_io_md": false, 00:18:32.819 "write_zeroes": true, 00:18:32.819 "zcopy": true, 00:18:32.819 "get_zone_info": false, 00:18:32.819 "zone_management": false, 00:18:32.819 "zone_append": false, 00:18:32.819 "compare": false, 00:18:32.819 "compare_and_write": false, 00:18:32.819 "abort": true, 00:18:32.819 "seek_hole": false, 00:18:32.819 "seek_data": false, 00:18:32.819 "copy": true, 00:18:32.819 "nvme_iov_md": false 00:18:32.819 }, 00:18:32.819 "memory_domains": [ 00:18:32.819 { 00:18:32.819 "dma_device_id": "system", 00:18:32.819 "dma_device_type": 1 00:18:32.819 }, 00:18:32.819 { 00:18:32.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.819 "dma_device_type": 2 00:18:32.819 } 00:18:32.819 ], 00:18:32.819 "driver_specific": {} 00:18:32.819 }' 00:18:32.819 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.819 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.819 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:32.819 10:45:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.078 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.078 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:33.078 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.078 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.078 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.078 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.078 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.338 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.338 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:33.338 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:33.338 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:33.597 10:45:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:33.597 "name": "BaseBdev4", 00:18:33.597 "aliases": [ 00:18:33.597 "65d2f885-9d55-4dff-bbca-5d65a5122ba4" 00:18:33.597 ], 00:18:33.597 "product_name": "Malloc disk", 00:18:33.597 "block_size": 512, 00:18:33.597 "num_blocks": 65536, 00:18:33.597 "uuid": "65d2f885-9d55-4dff-bbca-5d65a5122ba4", 00:18:33.597 "assigned_rate_limits": { 00:18:33.597 "rw_ios_per_sec": 0, 00:18:33.597 "rw_mbytes_per_sec": 0, 00:18:33.597 "r_mbytes_per_sec": 0, 00:18:33.597 "w_mbytes_per_sec": 0 00:18:33.597 }, 00:18:33.597 "claimed": true, 00:18:33.597 "claim_type": "exclusive_write", 00:18:33.597 "zoned": false, 00:18:33.597 "supported_io_types": { 00:18:33.597 "read": true, 00:18:33.597 "write": true, 00:18:33.597 "unmap": true, 00:18:33.597 "flush": true, 00:18:33.597 "reset": true, 00:18:33.597 "nvme_admin": false, 00:18:33.597 "nvme_io": false, 00:18:33.597 "nvme_io_md": false, 00:18:33.597 "write_zeroes": true, 00:18:33.597 "zcopy": true, 00:18:33.597 "get_zone_info": false, 00:18:33.597 "zone_management": false, 00:18:33.597 "zone_append": false, 00:18:33.597 "compare": false, 00:18:33.597 "compare_and_write": false, 00:18:33.597 "abort": true, 00:18:33.597 "seek_hole": false, 00:18:33.597 "seek_data": false, 00:18:33.597 "copy": true, 00:18:33.597 "nvme_iov_md": false 00:18:33.597 }, 00:18:33.597 "memory_domains": [ 00:18:33.597 { 00:18:33.597 "dma_device_id": "system", 00:18:33.597 "dma_device_type": 1 00:18:33.597 }, 00:18:33.597 { 00:18:33.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.597 "dma_device_type": 2 00:18:33.597 } 00:18:33.597 ], 00:18:33.597 "driver_specific": {} 00:18:33.597 }' 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.597 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.856 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.856 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.856 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.856 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.856 10:45:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:34.115 [2024-07-12 10:45:09.143798] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:34.115 [2024-07-12 10:45:09.143823] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:34.115 [2024-07-12 10:45:09.143874] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
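The @194-@208 block traced above is verify_raid_bdev_properties: it dumps Existed_Raid once, collects the names of every configured base bdev, and checks that each of them reports the same block_size, md_size, md_interleave and dif_type as the raid volume (512 / null / null / null in this run). A condensed sketch of that loop, with variable names taken from the trace rather than from a canonical script:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
raid_bdev_info=$($rpc -s $sock bdev_get_bdevs -b Existed_Raid | jq '.[]')
base_bdev_names=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<< "$raid_bdev_info")
for name in $base_bdev_names; do                     # NewBaseBdev BaseBdev2 BaseBdev3 BaseBdev4
  base_bdev_info=$($rpc -s $sock bdev_get_bdevs -b "$name" | jq '.[]')
  for field in .block_size .md_size .md_interleave .dif_type; do
    # each comparison must hold for the raid bdev and every configured base bdev
    [[ "$(jq "$field" <<< "$raid_bdev_info")" == "$(jq "$field" <<< "$base_bdev_info")" ]]
  done
done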
00:18:34.115 [2024-07-12 10:45:09.143934] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:34.115 [2024-07-12 10:45:09.143945] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfe3470 name Existed_Raid, state offline 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2080502 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2080502 ']' 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2080502 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2080502 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:34.115 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:34.116 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2080502' 00:18:34.116 killing process with pid 2080502 00:18:34.116 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2080502 00:18:34.116 [2024-07-12 10:45:09.213169] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:34.116 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2080502 00:18:34.116 [2024-07-12 10:45:09.251438] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:34.374 10:45:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:34.374 00:18:34.374 real 0m32.384s 00:18:34.374 user 0m59.535s 00:18:34.374 sys 0m5.731s 00:18:34.374 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:34.374 10:45:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.374 ************************************ 00:18:34.374 END TEST raid_state_function_test_sb 00:18:34.374 ************************************ 00:18:34.374 10:45:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:34.374 10:45:09 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:34.374 10:45:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:34.374 10:45:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:34.374 10:45:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:34.374 ************************************ 00:18:34.374 START TEST raid_superblock_test 00:18:34.374 ************************************ 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local 
base_bdevs_malloc 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2085895 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2085895 /var/tmp/spdk-raid.sock 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2085895 ']' 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:34.374 10:45:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:34.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:34.375 10:45:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:34.375 10:45:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.633 [2024-07-12 10:45:09.585837] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
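Once the bdev_svc app for raid_superblock_test is listening on /var/tmp/spdk-raid.sock, the trace that follows builds four passthru bdevs (pt1..pt4) on top of fresh malloc bdevs, each with a fixed UUID, and assembles them into raid_bdev1 with a superblock. A condensed sketch of that construction using the sizes, names and UUIDs visible in the trace; the real script drives it through its (( i <= num_base_bdevs )) loop rather than a literal for list:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
for i in 1 2 3 4; do
  $rpc -s $sock bdev_malloc_create 32 512 -b malloc$i
  $rpc -s $sock bdev_passthru_create -b malloc$i -p pt$i \
       -u 00000000-0000-0000-0000-00000000000$i
done
# -z 64: strip size in KiB, -r raid0: raid level, -s: create the raid with an on-disk superblock
$rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s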
00:18:34.633 [2024-07-12 10:45:09.585883] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2085895 ] 00:18:34.633 [2024-07-12 10:45:09.698657] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:34.633 [2024-07-12 10:45:09.802358] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:34.892 [2024-07-12 10:45:09.865201] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:34.892 [2024-07-12 10:45:09.865237] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:35.462 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:35.721 malloc1 00:18:35.721 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:35.721 [2024-07-12 10:45:10.853748] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:35.721 [2024-07-12 10:45:10.853797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.721 [2024-07-12 10:45:10.853820] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ebd570 00:18:35.722 [2024-07-12 10:45:10.853833] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.722 [2024-07-12 10:45:10.855564] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.722 [2024-07-12 10:45:10.855593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:35.722 pt1 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:35.722 10:45:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:35.980 malloc2 00:18:35.980 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:36.244 [2024-07-12 10:45:11.260843] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:36.244 [2024-07-12 10:45:11.260895] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.244 [2024-07-12 10:45:11.260913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ebe970 00:18:36.244 [2024-07-12 10:45:11.260926] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.244 [2024-07-12 10:45:11.262570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.245 [2024-07-12 10:45:11.262599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:36.245 pt2 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:36.245 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:36.564 malloc3 00:18:36.564 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:36.564 [2024-07-12 10:45:11.662514] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:36.564 [2024-07-12 10:45:11.662562] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.564 [2024-07-12 10:45:11.662579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2055340 00:18:36.564 [2024-07-12 10:45:11.662592] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.564 [2024-07-12 10:45:11.664123] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.564 [2024-07-12 10:45:11.664153] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:36.565 pt3 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:36.565 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:36.823 malloc4 00:18:36.823 10:45:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:37.082 [2024-07-12 10:45:12.064140] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:37.082 [2024-07-12 10:45:12.064185] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:37.082 [2024-07-12 10:45:12.064206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2057c60 00:18:37.082 [2024-07-12 10:45:12.064219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:37.082 [2024-07-12 10:45:12.065755] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:37.082 [2024-07-12 10:45:12.065783] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:37.082 pt4 00:18:37.082 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:37.082 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:37.082 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:37.341 [2024-07-12 10:45:12.292802] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:37.341 [2024-07-12 10:45:12.294172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:37.341 [2024-07-12 10:45:12.294229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:37.341 [2024-07-12 10:45:12.294273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:37.341 [2024-07-12 10:45:12.294443] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1eb5530 00:18:37.341 [2024-07-12 10:45:12.294454] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:37.341 [2024-07-12 10:45:12.294668] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eb3770 00:18:37.341 [2024-07-12 10:45:12.294815] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1eb5530 00:18:37.341 [2024-07-12 10:45:12.294826] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1eb5530 00:18:37.341 [2024-07-12 10:45:12.294925] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.341 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.342 "name": "raid_bdev1", 00:18:37.342 "uuid": "8f61e88e-322c-490c-83b0-387f52179d67", 00:18:37.342 "strip_size_kb": 64, 00:18:37.342 "state": "online", 00:18:37.342 "raid_level": "raid0", 00:18:37.342 "superblock": true, 00:18:37.342 "num_base_bdevs": 4, 00:18:37.342 "num_base_bdevs_discovered": 4, 00:18:37.342 "num_base_bdevs_operational": 4, 00:18:37.342 "base_bdevs_list": [ 00:18:37.342 { 00:18:37.342 "name": "pt1", 00:18:37.342 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:37.342 "is_configured": true, 00:18:37.342 "data_offset": 2048, 00:18:37.342 "data_size": 63488 00:18:37.342 }, 00:18:37.342 { 00:18:37.342 "name": "pt2", 00:18:37.342 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:37.342 "is_configured": true, 00:18:37.342 "data_offset": 2048, 00:18:37.342 "data_size": 63488 00:18:37.342 }, 00:18:37.342 { 00:18:37.342 "name": "pt3", 00:18:37.342 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:37.342 "is_configured": true, 00:18:37.342 "data_offset": 2048, 00:18:37.342 "data_size": 63488 00:18:37.342 }, 00:18:37.342 { 00:18:37.342 "name": "pt4", 00:18:37.342 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:37.342 "is_configured": true, 00:18:37.342 "data_offset": 2048, 00:18:37.342 "data_size": 63488 00:18:37.342 } 00:18:37.342 ] 00:18:37.342 }' 00:18:37.342 10:45:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.342 10:45:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:38.275 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:38.532 [2024-07-12 10:45:13.548394] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:38.532 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:38.532 "name": "raid_bdev1", 00:18:38.532 "aliases": [ 00:18:38.532 "8f61e88e-322c-490c-83b0-387f52179d67" 00:18:38.532 ], 00:18:38.532 "product_name": "Raid Volume", 00:18:38.532 "block_size": 512, 00:18:38.532 "num_blocks": 253952, 00:18:38.532 "uuid": "8f61e88e-322c-490c-83b0-387f52179d67", 00:18:38.532 "assigned_rate_limits": { 00:18:38.532 "rw_ios_per_sec": 0, 00:18:38.532 "rw_mbytes_per_sec": 0, 00:18:38.532 "r_mbytes_per_sec": 0, 00:18:38.532 "w_mbytes_per_sec": 0 00:18:38.532 }, 00:18:38.532 "claimed": false, 00:18:38.532 "zoned": false, 00:18:38.532 "supported_io_types": { 00:18:38.532 "read": true, 00:18:38.532 "write": true, 00:18:38.532 "unmap": true, 00:18:38.532 "flush": true, 00:18:38.532 "reset": true, 00:18:38.532 "nvme_admin": false, 00:18:38.532 "nvme_io": false, 00:18:38.532 "nvme_io_md": false, 00:18:38.532 "write_zeroes": true, 00:18:38.532 "zcopy": false, 00:18:38.532 "get_zone_info": false, 00:18:38.532 "zone_management": false, 00:18:38.532 "zone_append": false, 00:18:38.532 "compare": false, 00:18:38.532 "compare_and_write": false, 00:18:38.532 "abort": false, 00:18:38.532 "seek_hole": false, 00:18:38.532 "seek_data": false, 00:18:38.532 "copy": false, 00:18:38.532 "nvme_iov_md": false 00:18:38.532 }, 00:18:38.532 "memory_domains": [ 00:18:38.532 { 00:18:38.532 "dma_device_id": "system", 00:18:38.532 "dma_device_type": 1 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.532 "dma_device_type": 2 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "dma_device_id": "system", 00:18:38.532 "dma_device_type": 1 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.532 "dma_device_type": 2 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "dma_device_id": "system", 00:18:38.532 "dma_device_type": 1 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.532 "dma_device_type": 2 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "dma_device_id": "system", 00:18:38.532 "dma_device_type": 1 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.532 "dma_device_type": 2 00:18:38.532 } 00:18:38.532 ], 00:18:38.532 "driver_specific": { 00:18:38.532 "raid": { 00:18:38.532 "uuid": "8f61e88e-322c-490c-83b0-387f52179d67", 00:18:38.532 "strip_size_kb": 64, 00:18:38.532 "state": "online", 00:18:38.532 "raid_level": "raid0", 00:18:38.532 "superblock": 
true, 00:18:38.532 "num_base_bdevs": 4, 00:18:38.532 "num_base_bdevs_discovered": 4, 00:18:38.532 "num_base_bdevs_operational": 4, 00:18:38.532 "base_bdevs_list": [ 00:18:38.532 { 00:18:38.532 "name": "pt1", 00:18:38.532 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:38.532 "is_configured": true, 00:18:38.532 "data_offset": 2048, 00:18:38.532 "data_size": 63488 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "name": "pt2", 00:18:38.532 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:38.532 "is_configured": true, 00:18:38.532 "data_offset": 2048, 00:18:38.532 "data_size": 63488 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "name": "pt3", 00:18:38.532 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:38.532 "is_configured": true, 00:18:38.532 "data_offset": 2048, 00:18:38.532 "data_size": 63488 00:18:38.532 }, 00:18:38.532 { 00:18:38.532 "name": "pt4", 00:18:38.532 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:38.532 "is_configured": true, 00:18:38.532 "data_offset": 2048, 00:18:38.532 "data_size": 63488 00:18:38.532 } 00:18:38.532 ] 00:18:38.532 } 00:18:38.532 } 00:18:38.532 }' 00:18:38.532 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:38.532 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:38.532 pt2 00:18:38.532 pt3 00:18:38.532 pt4' 00:18:38.532 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.532 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:38.532 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.789 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.789 "name": "pt1", 00:18:38.789 "aliases": [ 00:18:38.789 "00000000-0000-0000-0000-000000000001" 00:18:38.789 ], 00:18:38.789 "product_name": "passthru", 00:18:38.789 "block_size": 512, 00:18:38.789 "num_blocks": 65536, 00:18:38.789 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:38.789 "assigned_rate_limits": { 00:18:38.789 "rw_ios_per_sec": 0, 00:18:38.789 "rw_mbytes_per_sec": 0, 00:18:38.789 "r_mbytes_per_sec": 0, 00:18:38.789 "w_mbytes_per_sec": 0 00:18:38.789 }, 00:18:38.789 "claimed": true, 00:18:38.789 "claim_type": "exclusive_write", 00:18:38.789 "zoned": false, 00:18:38.789 "supported_io_types": { 00:18:38.789 "read": true, 00:18:38.789 "write": true, 00:18:38.789 "unmap": true, 00:18:38.789 "flush": true, 00:18:38.789 "reset": true, 00:18:38.789 "nvme_admin": false, 00:18:38.789 "nvme_io": false, 00:18:38.789 "nvme_io_md": false, 00:18:38.789 "write_zeroes": true, 00:18:38.789 "zcopy": true, 00:18:38.789 "get_zone_info": false, 00:18:38.789 "zone_management": false, 00:18:38.789 "zone_append": false, 00:18:38.789 "compare": false, 00:18:38.789 "compare_and_write": false, 00:18:38.789 "abort": true, 00:18:38.789 "seek_hole": false, 00:18:38.789 "seek_data": false, 00:18:38.789 "copy": true, 00:18:38.789 "nvme_iov_md": false 00:18:38.789 }, 00:18:38.789 "memory_domains": [ 00:18:38.789 { 00:18:38.789 "dma_device_id": "system", 00:18:38.789 "dma_device_type": 1 00:18:38.789 }, 00:18:38.789 { 00:18:38.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.789 "dma_device_type": 2 00:18:38.789 } 00:18:38.789 ], 00:18:38.789 "driver_specific": { 00:18:38.789 "passthru": 
{ 00:18:38.789 "name": "pt1", 00:18:38.789 "base_bdev_name": "malloc1" 00:18:38.789 } 00:18:38.789 } 00:18:38.789 }' 00:18:38.789 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.789 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.789 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.789 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.046 10:45:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:39.046 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:39.303 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:39.303 "name": "pt2", 00:18:39.303 "aliases": [ 00:18:39.303 "00000000-0000-0000-0000-000000000002" 00:18:39.303 ], 00:18:39.303 "product_name": "passthru", 00:18:39.303 "block_size": 512, 00:18:39.303 "num_blocks": 65536, 00:18:39.303 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:39.303 "assigned_rate_limits": { 00:18:39.303 "rw_ios_per_sec": 0, 00:18:39.303 "rw_mbytes_per_sec": 0, 00:18:39.303 "r_mbytes_per_sec": 0, 00:18:39.303 "w_mbytes_per_sec": 0 00:18:39.303 }, 00:18:39.303 "claimed": true, 00:18:39.303 "claim_type": "exclusive_write", 00:18:39.303 "zoned": false, 00:18:39.303 "supported_io_types": { 00:18:39.303 "read": true, 00:18:39.303 "write": true, 00:18:39.303 "unmap": true, 00:18:39.303 "flush": true, 00:18:39.303 "reset": true, 00:18:39.303 "nvme_admin": false, 00:18:39.303 "nvme_io": false, 00:18:39.303 "nvme_io_md": false, 00:18:39.303 "write_zeroes": true, 00:18:39.303 "zcopy": true, 00:18:39.303 "get_zone_info": false, 00:18:39.303 "zone_management": false, 00:18:39.303 "zone_append": false, 00:18:39.303 "compare": false, 00:18:39.303 "compare_and_write": false, 00:18:39.303 "abort": true, 00:18:39.303 "seek_hole": false, 00:18:39.303 "seek_data": false, 00:18:39.303 "copy": true, 00:18:39.303 "nvme_iov_md": false 00:18:39.303 }, 00:18:39.303 "memory_domains": [ 00:18:39.303 { 00:18:39.303 "dma_device_id": "system", 00:18:39.303 "dma_device_type": 1 00:18:39.303 }, 00:18:39.303 { 00:18:39.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.303 "dma_device_type": 2 00:18:39.303 } 00:18:39.303 ], 00:18:39.303 "driver_specific": { 00:18:39.303 "passthru": { 00:18:39.303 "name": "pt2", 00:18:39.303 "base_bdev_name": "malloc2" 00:18:39.303 } 00:18:39.303 } 00:18:39.303 }' 00:18:39.303 
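For reference, the construction sequence exercised by raid_superblock_test up to this point reduces to the rpc.py calls sketched below. This is a minimal recap assembled from the commands visible in the trace above; the rpc() helper is only shorthand introduced here for the full scripts/rpc.py path and the -s /var/tmp/spdk-raid.sock socket shown in the xtrace, and nothing beyond those commands is assumed.

  # shorthand for the rpc.py invocation used throughout this trace
  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  # four 32 MiB malloc bdevs (512-byte blocks -> 65536 blocks each),
  # each wrapped in a passthru bdev with a fixed UUID
  for i in 1 2 3 4; do
      rpc bdev_malloc_create 32 512 -b "malloc$i"
      rpc bdev_passthru_create -b "malloc$i" -p "pt$i" -u "00000000-0000-0000-0000-00000000000$i"
  done

  # raid0 volume over the passthru bdevs: 64 KiB strip (-z 64), superblock
  # enabled (-s, matching the "superblock": true field dumped above)
  rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

  # the state and property checks performed by the test
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
  rpc bdev_get_bdevs -b raid_bdev1 | jq '.[]'

The 253952-block raid volume reported above is consistent with four 65536-block base bdevs that each reserve 2048 blocks (data_offset) and expose 63488 data blocks: 4 * 63488 = 253952.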
10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.560 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.817 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.817 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.817 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:39.817 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:39.817 10:45:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:40.074 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:40.074 "name": "pt3", 00:18:40.074 "aliases": [ 00:18:40.074 "00000000-0000-0000-0000-000000000003" 00:18:40.074 ], 00:18:40.074 "product_name": "passthru", 00:18:40.074 "block_size": 512, 00:18:40.074 "num_blocks": 65536, 00:18:40.074 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:40.074 "assigned_rate_limits": { 00:18:40.074 "rw_ios_per_sec": 0, 00:18:40.074 "rw_mbytes_per_sec": 0, 00:18:40.074 "r_mbytes_per_sec": 0, 00:18:40.074 "w_mbytes_per_sec": 0 00:18:40.074 }, 00:18:40.074 "claimed": true, 00:18:40.074 "claim_type": "exclusive_write", 00:18:40.074 "zoned": false, 00:18:40.074 "supported_io_types": { 00:18:40.074 "read": true, 00:18:40.074 "write": true, 00:18:40.074 "unmap": true, 00:18:40.075 "flush": true, 00:18:40.075 "reset": true, 00:18:40.075 "nvme_admin": false, 00:18:40.075 "nvme_io": false, 00:18:40.075 "nvme_io_md": false, 00:18:40.075 "write_zeroes": true, 00:18:40.075 "zcopy": true, 00:18:40.075 "get_zone_info": false, 00:18:40.075 "zone_management": false, 00:18:40.075 "zone_append": false, 00:18:40.075 "compare": false, 00:18:40.075 "compare_and_write": false, 00:18:40.075 "abort": true, 00:18:40.075 "seek_hole": false, 00:18:40.075 "seek_data": false, 00:18:40.075 "copy": true, 00:18:40.075 "nvme_iov_md": false 00:18:40.075 }, 00:18:40.075 "memory_domains": [ 00:18:40.075 { 00:18:40.075 "dma_device_id": "system", 00:18:40.075 "dma_device_type": 1 00:18:40.075 }, 00:18:40.075 { 00:18:40.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.075 "dma_device_type": 2 00:18:40.075 } 00:18:40.075 ], 00:18:40.075 "driver_specific": { 00:18:40.075 "passthru": { 00:18:40.075 "name": "pt3", 00:18:40.075 "base_bdev_name": "malloc3" 00:18:40.075 } 00:18:40.075 } 00:18:40.075 }' 00:18:40.075 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.075 10:45:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.075 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:40.075 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.075 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.075 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:40.075 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:40.075 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:40.332 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:40.332 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.332 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.332 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:40.332 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:40.332 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:40.332 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:40.589 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:40.589 "name": "pt4", 00:18:40.589 "aliases": [ 00:18:40.589 "00000000-0000-0000-0000-000000000004" 00:18:40.589 ], 00:18:40.589 "product_name": "passthru", 00:18:40.589 "block_size": 512, 00:18:40.589 "num_blocks": 65536, 00:18:40.589 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:40.589 "assigned_rate_limits": { 00:18:40.589 "rw_ios_per_sec": 0, 00:18:40.589 "rw_mbytes_per_sec": 0, 00:18:40.589 "r_mbytes_per_sec": 0, 00:18:40.589 "w_mbytes_per_sec": 0 00:18:40.589 }, 00:18:40.589 "claimed": true, 00:18:40.589 "claim_type": "exclusive_write", 00:18:40.589 "zoned": false, 00:18:40.589 "supported_io_types": { 00:18:40.589 "read": true, 00:18:40.589 "write": true, 00:18:40.589 "unmap": true, 00:18:40.589 "flush": true, 00:18:40.589 "reset": true, 00:18:40.589 "nvme_admin": false, 00:18:40.589 "nvme_io": false, 00:18:40.589 "nvme_io_md": false, 00:18:40.589 "write_zeroes": true, 00:18:40.589 "zcopy": true, 00:18:40.589 "get_zone_info": false, 00:18:40.589 "zone_management": false, 00:18:40.589 "zone_append": false, 00:18:40.589 "compare": false, 00:18:40.589 "compare_and_write": false, 00:18:40.589 "abort": true, 00:18:40.589 "seek_hole": false, 00:18:40.589 "seek_data": false, 00:18:40.589 "copy": true, 00:18:40.589 "nvme_iov_md": false 00:18:40.589 }, 00:18:40.589 "memory_domains": [ 00:18:40.589 { 00:18:40.589 "dma_device_id": "system", 00:18:40.589 "dma_device_type": 1 00:18:40.589 }, 00:18:40.589 { 00:18:40.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.589 "dma_device_type": 2 00:18:40.589 } 00:18:40.589 ], 00:18:40.589 "driver_specific": { 00:18:40.589 "passthru": { 00:18:40.589 "name": "pt4", 00:18:40.589 "base_bdev_name": "malloc4" 00:18:40.589 } 00:18:40.589 } 00:18:40.589 }' 00:18:40.589 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.589 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.589 10:45:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:40.589 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.589 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:40.847 10:45:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:41.105 [2024-07-12 10:45:16.215440] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:41.105 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8f61e88e-322c-490c-83b0-387f52179d67 00:18:41.105 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8f61e88e-322c-490c-83b0-387f52179d67 ']' 00:18:41.105 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:41.671 [2024-07-12 10:45:16.716466] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:41.671 [2024-07-12 10:45:16.716495] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:41.671 [2024-07-12 10:45:16.716548] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:41.671 [2024-07-12 10:45:16.716611] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:41.671 [2024-07-12 10:45:16.716623] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eb5530 name raid_bdev1, state offline 00:18:41.671 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.671 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:41.929 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:41.929 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:41.929 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:41.929 10:45:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:42.496 10:45:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:42.496 10:45:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:42.754 10:45:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:42.754 10:45:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:43.320 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:43.320 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:43.320 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:43.320 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:43.579 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:43.838 [2024-07-12 10:45:18.962298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:43.838 [2024-07-12 10:45:18.963647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:43.838 [2024-07-12 10:45:18.963690] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
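For reference, the teardown and the negative create attempt running here (its -17 "File exists" response appears just below) reduce to the calls sketched next. As before, this is a recap of commands taken from this trace, with rpc() as shorthand for the full scripts/rpc.py path and the -s /var/tmp/spdk-raid.sock socket; it is a simplified sketch, not the literal test script.

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  # delete the raid volume, then the passthru bdevs; the malloc bdevs keep the written superblock
  rpc bdev_raid_delete raid_bdev1
  for i in 1 2 3 4; do rpc bdev_passthru_delete "pt$i"; done

  # creating a raid directly on the malloc bdevs now fails with -17 (File exists),
  # because each of them still carries raid_bdev1's superblock
  if ! rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1; then
      echo "expected failure: stale superblock found on the base bdevs"
  fi

  # re-creating the passthru bdevs lets the examine path pick the superblock up again;
  # raid_bdev1 stays in the configuring state until all four base bdevs are claimed, then goes online
  for i in 1 2 3 4; do
      rpc bdev_passthru_create -b "malloc$i" -p "pt$i" -u "00000000-0000-0000-0000-00000000000$i"
  done

The trace additionally creates and immediately deletes pt2 once before the final loop, to exercise removal of a base bdev while the raid is still configuring; that detour is omitted from the sketch.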
00:18:43.838 [2024-07-12 10:45:18.963724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:43.838 [2024-07-12 10:45:18.963768] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:43.838 [2024-07-12 10:45:18.963806] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:43.838 [2024-07-12 10:45:18.963828] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:43.838 [2024-07-12 10:45:18.963850] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:43.838 [2024-07-12 10:45:18.963868] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:43.838 [2024-07-12 10:45:18.963878] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2060ff0 name raid_bdev1, state configuring 00:18:43.838 request: 00:18:43.838 { 00:18:43.838 "name": "raid_bdev1", 00:18:43.838 "raid_level": "raid0", 00:18:43.838 "base_bdevs": [ 00:18:43.838 "malloc1", 00:18:43.838 "malloc2", 00:18:43.838 "malloc3", 00:18:43.838 "malloc4" 00:18:43.838 ], 00:18:43.838 "strip_size_kb": 64, 00:18:43.838 "superblock": false, 00:18:43.838 "method": "bdev_raid_create", 00:18:43.838 "req_id": 1 00:18:43.838 } 00:18:43.838 Got JSON-RPC error response 00:18:43.838 response: 00:18:43.838 { 00:18:43.838 "code": -17, 00:18:43.838 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:43.838 } 00:18:43.838 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:43.838 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:43.838 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:43.838 10:45:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:43.838 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.838 10:45:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:44.097 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:44.097 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:44.097 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:44.355 [2024-07-12 10:45:19.447517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:44.355 [2024-07-12 10:45:19.447562] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:44.355 [2024-07-12 10:45:19.447585] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ebd7a0 00:18:44.355 [2024-07-12 10:45:19.447597] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:44.355 [2024-07-12 10:45:19.449221] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:44.355 [2024-07-12 10:45:19.449249] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:44.355 [2024-07-12 
10:45:19.449315] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:44.355 [2024-07-12 10:45:19.449339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:44.355 pt1 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.355 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.613 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.613 "name": "raid_bdev1", 00:18:44.613 "uuid": "8f61e88e-322c-490c-83b0-387f52179d67", 00:18:44.613 "strip_size_kb": 64, 00:18:44.613 "state": "configuring", 00:18:44.613 "raid_level": "raid0", 00:18:44.613 "superblock": true, 00:18:44.613 "num_base_bdevs": 4, 00:18:44.613 "num_base_bdevs_discovered": 1, 00:18:44.613 "num_base_bdevs_operational": 4, 00:18:44.613 "base_bdevs_list": [ 00:18:44.613 { 00:18:44.613 "name": "pt1", 00:18:44.613 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:44.613 "is_configured": true, 00:18:44.613 "data_offset": 2048, 00:18:44.613 "data_size": 63488 00:18:44.613 }, 00:18:44.613 { 00:18:44.613 "name": null, 00:18:44.613 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:44.613 "is_configured": false, 00:18:44.613 "data_offset": 2048, 00:18:44.613 "data_size": 63488 00:18:44.613 }, 00:18:44.613 { 00:18:44.613 "name": null, 00:18:44.613 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:44.613 "is_configured": false, 00:18:44.613 "data_offset": 2048, 00:18:44.613 "data_size": 63488 00:18:44.613 }, 00:18:44.613 { 00:18:44.613 "name": null, 00:18:44.613 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:44.613 "is_configured": false, 00:18:44.613 "data_offset": 2048, 00:18:44.613 "data_size": 63488 00:18:44.613 } 00:18:44.613 ] 00:18:44.613 }' 00:18:44.613 10:45:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.613 10:45:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.180 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:45.180 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:45.439 [2024-07-12 10:45:20.538424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:45.439 [2024-07-12 10:45:20.538477] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:45.439 [2024-07-12 10:45:20.538503] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2056940 00:18:45.439 [2024-07-12 10:45:20.538515] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:45.439 [2024-07-12 10:45:20.538849] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:45.439 [2024-07-12 10:45:20.538867] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:45.439 [2024-07-12 10:45:20.538930] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:45.439 [2024-07-12 10:45:20.538949] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:45.439 pt2 00:18:45.439 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:45.699 [2024-07-12 10:45:20.779173] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:45.699 10:45:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.958 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.958 "name": "raid_bdev1", 00:18:45.958 "uuid": "8f61e88e-322c-490c-83b0-387f52179d67", 00:18:45.958 "strip_size_kb": 64, 00:18:45.958 "state": "configuring", 00:18:45.958 "raid_level": "raid0", 00:18:45.958 "superblock": true, 00:18:45.958 "num_base_bdevs": 4, 00:18:45.958 "num_base_bdevs_discovered": 1, 00:18:45.958 "num_base_bdevs_operational": 4, 00:18:45.958 "base_bdevs_list": [ 00:18:45.958 { 00:18:45.958 "name": "pt1", 00:18:45.958 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:45.958 "is_configured": true, 00:18:45.958 "data_offset": 2048, 00:18:45.958 "data_size": 63488 00:18:45.958 }, 00:18:45.958 { 
00:18:45.958 "name": null, 00:18:45.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:45.958 "is_configured": false, 00:18:45.958 "data_offset": 2048, 00:18:45.958 "data_size": 63488 00:18:45.958 }, 00:18:45.958 { 00:18:45.958 "name": null, 00:18:45.958 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:45.958 "is_configured": false, 00:18:45.958 "data_offset": 2048, 00:18:45.958 "data_size": 63488 00:18:45.958 }, 00:18:45.958 { 00:18:45.958 "name": null, 00:18:45.958 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:45.958 "is_configured": false, 00:18:45.958 "data_offset": 2048, 00:18:45.958 "data_size": 63488 00:18:45.958 } 00:18:45.958 ] 00:18:45.958 }' 00:18:45.958 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.958 10:45:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.526 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:46.526 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:46.526 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:46.784 [2024-07-12 10:45:21.858019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:46.784 [2024-07-12 10:45:21.858067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:46.784 [2024-07-12 10:45:21.858086] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb4060 00:18:46.784 [2024-07-12 10:45:21.858098] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:46.784 [2024-07-12 10:45:21.858423] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:46.784 [2024-07-12 10:45:21.858440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:46.784 [2024-07-12 10:45:21.858508] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:46.784 [2024-07-12 10:45:21.858527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:46.784 pt2 00:18:46.784 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:46.784 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:46.784 10:45:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:47.043 [2024-07-12 10:45:22.102673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:47.043 [2024-07-12 10:45:22.102707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:47.043 [2024-07-12 10:45:22.102726] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb68d0 00:18:47.043 [2024-07-12 10:45:22.102739] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:47.043 [2024-07-12 10:45:22.103032] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:47.043 [2024-07-12 10:45:22.103049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:47.043 [2024-07-12 10:45:22.103109] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:47.043 [2024-07-12 10:45:22.103127] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:47.043 pt3 00:18:47.043 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:47.043 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:47.043 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:47.302 [2024-07-12 10:45:22.343316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:47.302 [2024-07-12 10:45:22.343355] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:47.302 [2024-07-12 10:45:22.343371] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb7b80 00:18:47.302 [2024-07-12 10:45:22.343383] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:47.302 [2024-07-12 10:45:22.343670] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:47.302 [2024-07-12 10:45:22.343689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:47.302 [2024-07-12 10:45:22.343739] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:47.302 [2024-07-12 10:45:22.343757] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:47.302 [2024-07-12 10:45:22.343871] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1eb4780 00:18:47.302 [2024-07-12 10:45:22.343882] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:47.302 [2024-07-12 10:45:22.344052] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eb9d70 00:18:47.302 [2024-07-12 10:45:22.344177] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1eb4780 00:18:47.302 [2024-07-12 10:45:22.344187] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1eb4780 00:18:47.302 [2024-07-12 10:45:22.344282] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:47.302 pt4 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.302 10:45:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.302 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.561 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.561 "name": "raid_bdev1", 00:18:47.561 "uuid": "8f61e88e-322c-490c-83b0-387f52179d67", 00:18:47.561 "strip_size_kb": 64, 00:18:47.561 "state": "online", 00:18:47.561 "raid_level": "raid0", 00:18:47.561 "superblock": true, 00:18:47.561 "num_base_bdevs": 4, 00:18:47.561 "num_base_bdevs_discovered": 4, 00:18:47.561 "num_base_bdevs_operational": 4, 00:18:47.561 "base_bdevs_list": [ 00:18:47.561 { 00:18:47.561 "name": "pt1", 00:18:47.561 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:47.561 "is_configured": true, 00:18:47.561 "data_offset": 2048, 00:18:47.561 "data_size": 63488 00:18:47.561 }, 00:18:47.561 { 00:18:47.561 "name": "pt2", 00:18:47.561 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:47.561 "is_configured": true, 00:18:47.561 "data_offset": 2048, 00:18:47.561 "data_size": 63488 00:18:47.561 }, 00:18:47.561 { 00:18:47.561 "name": "pt3", 00:18:47.561 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:47.561 "is_configured": true, 00:18:47.561 "data_offset": 2048, 00:18:47.561 "data_size": 63488 00:18:47.561 }, 00:18:47.561 { 00:18:47.561 "name": "pt4", 00:18:47.561 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:47.561 "is_configured": true, 00:18:47.561 "data_offset": 2048, 00:18:47.561 "data_size": 63488 00:18:47.561 } 00:18:47.561 ] 00:18:47.561 }' 00:18:47.561 10:45:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.561 10:45:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.129 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:48.130 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:48.130 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:48.130 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:48.130 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:48.130 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:48.130 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:48.130 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:48.388 [2024-07-12 10:45:23.446558] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:48.388 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:48.388 "name": "raid_bdev1", 00:18:48.388 "aliases": [ 00:18:48.388 "8f61e88e-322c-490c-83b0-387f52179d67" 00:18:48.388 ], 00:18:48.388 "product_name": "Raid Volume", 00:18:48.388 "block_size": 512, 00:18:48.388 "num_blocks": 253952, 00:18:48.388 "uuid": 
"8f61e88e-322c-490c-83b0-387f52179d67", 00:18:48.388 "assigned_rate_limits": { 00:18:48.388 "rw_ios_per_sec": 0, 00:18:48.388 "rw_mbytes_per_sec": 0, 00:18:48.388 "r_mbytes_per_sec": 0, 00:18:48.388 "w_mbytes_per_sec": 0 00:18:48.388 }, 00:18:48.388 "claimed": false, 00:18:48.388 "zoned": false, 00:18:48.388 "supported_io_types": { 00:18:48.388 "read": true, 00:18:48.388 "write": true, 00:18:48.388 "unmap": true, 00:18:48.388 "flush": true, 00:18:48.388 "reset": true, 00:18:48.388 "nvme_admin": false, 00:18:48.388 "nvme_io": false, 00:18:48.388 "nvme_io_md": false, 00:18:48.388 "write_zeroes": true, 00:18:48.388 "zcopy": false, 00:18:48.388 "get_zone_info": false, 00:18:48.388 "zone_management": false, 00:18:48.388 "zone_append": false, 00:18:48.388 "compare": false, 00:18:48.388 "compare_and_write": false, 00:18:48.388 "abort": false, 00:18:48.388 "seek_hole": false, 00:18:48.388 "seek_data": false, 00:18:48.388 "copy": false, 00:18:48.388 "nvme_iov_md": false 00:18:48.388 }, 00:18:48.388 "memory_domains": [ 00:18:48.388 { 00:18:48.388 "dma_device_id": "system", 00:18:48.388 "dma_device_type": 1 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.388 "dma_device_type": 2 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "dma_device_id": "system", 00:18:48.388 "dma_device_type": 1 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.388 "dma_device_type": 2 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "dma_device_id": "system", 00:18:48.388 "dma_device_type": 1 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.388 "dma_device_type": 2 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "dma_device_id": "system", 00:18:48.388 "dma_device_type": 1 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.388 "dma_device_type": 2 00:18:48.388 } 00:18:48.388 ], 00:18:48.388 "driver_specific": { 00:18:48.388 "raid": { 00:18:48.388 "uuid": "8f61e88e-322c-490c-83b0-387f52179d67", 00:18:48.388 "strip_size_kb": 64, 00:18:48.388 "state": "online", 00:18:48.388 "raid_level": "raid0", 00:18:48.388 "superblock": true, 00:18:48.388 "num_base_bdevs": 4, 00:18:48.388 "num_base_bdevs_discovered": 4, 00:18:48.388 "num_base_bdevs_operational": 4, 00:18:48.388 "base_bdevs_list": [ 00:18:48.388 { 00:18:48.388 "name": "pt1", 00:18:48.388 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:48.388 "is_configured": true, 00:18:48.388 "data_offset": 2048, 00:18:48.388 "data_size": 63488 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "name": "pt2", 00:18:48.388 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:48.388 "is_configured": true, 00:18:48.388 "data_offset": 2048, 00:18:48.388 "data_size": 63488 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "name": "pt3", 00:18:48.388 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:48.388 "is_configured": true, 00:18:48.388 "data_offset": 2048, 00:18:48.388 "data_size": 63488 00:18:48.388 }, 00:18:48.388 { 00:18:48.388 "name": "pt4", 00:18:48.388 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:48.388 "is_configured": true, 00:18:48.388 "data_offset": 2048, 00:18:48.388 "data_size": 63488 00:18:48.388 } 00:18:48.388 ] 00:18:48.388 } 00:18:48.388 } 00:18:48.388 }' 00:18:48.388 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:48.388 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:18:48.388 pt2 00:18:48.388 pt3 00:18:48.388 pt4' 00:18:48.388 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.388 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:48.388 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:48.647 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.647 "name": "pt1", 00:18:48.647 "aliases": [ 00:18:48.647 "00000000-0000-0000-0000-000000000001" 00:18:48.647 ], 00:18:48.647 "product_name": "passthru", 00:18:48.647 "block_size": 512, 00:18:48.647 "num_blocks": 65536, 00:18:48.647 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:48.647 "assigned_rate_limits": { 00:18:48.647 "rw_ios_per_sec": 0, 00:18:48.647 "rw_mbytes_per_sec": 0, 00:18:48.647 "r_mbytes_per_sec": 0, 00:18:48.647 "w_mbytes_per_sec": 0 00:18:48.647 }, 00:18:48.647 "claimed": true, 00:18:48.647 "claim_type": "exclusive_write", 00:18:48.647 "zoned": false, 00:18:48.647 "supported_io_types": { 00:18:48.647 "read": true, 00:18:48.647 "write": true, 00:18:48.647 "unmap": true, 00:18:48.647 "flush": true, 00:18:48.647 "reset": true, 00:18:48.647 "nvme_admin": false, 00:18:48.647 "nvme_io": false, 00:18:48.647 "nvme_io_md": false, 00:18:48.647 "write_zeroes": true, 00:18:48.647 "zcopy": true, 00:18:48.647 "get_zone_info": false, 00:18:48.647 "zone_management": false, 00:18:48.647 "zone_append": false, 00:18:48.647 "compare": false, 00:18:48.647 "compare_and_write": false, 00:18:48.647 "abort": true, 00:18:48.647 "seek_hole": false, 00:18:48.647 "seek_data": false, 00:18:48.647 "copy": true, 00:18:48.647 "nvme_iov_md": false 00:18:48.647 }, 00:18:48.647 "memory_domains": [ 00:18:48.647 { 00:18:48.647 "dma_device_id": "system", 00:18:48.647 "dma_device_type": 1 00:18:48.647 }, 00:18:48.647 { 00:18:48.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.647 "dma_device_type": 2 00:18:48.647 } 00:18:48.647 ], 00:18:48.647 "driver_specific": { 00:18:48.647 "passthru": { 00:18:48.647 "name": "pt1", 00:18:48.647 "base_bdev_name": "malloc1" 00:18:48.647 } 00:18:48.647 } 00:18:48.647 }' 00:18:48.647 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.647 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.906 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:48.906 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.906 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.906 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:48.906 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.906 10:45:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.906 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.906 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.906 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.165 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.165 10:45:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:49.165 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:49.165 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:49.165 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:49.165 "name": "pt2", 00:18:49.165 "aliases": [ 00:18:49.165 "00000000-0000-0000-0000-000000000002" 00:18:49.165 ], 00:18:49.165 "product_name": "passthru", 00:18:49.165 "block_size": 512, 00:18:49.165 "num_blocks": 65536, 00:18:49.165 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:49.165 "assigned_rate_limits": { 00:18:49.165 "rw_ios_per_sec": 0, 00:18:49.166 "rw_mbytes_per_sec": 0, 00:18:49.166 "r_mbytes_per_sec": 0, 00:18:49.166 "w_mbytes_per_sec": 0 00:18:49.166 }, 00:18:49.166 "claimed": true, 00:18:49.166 "claim_type": "exclusive_write", 00:18:49.166 "zoned": false, 00:18:49.166 "supported_io_types": { 00:18:49.166 "read": true, 00:18:49.166 "write": true, 00:18:49.166 "unmap": true, 00:18:49.166 "flush": true, 00:18:49.166 "reset": true, 00:18:49.166 "nvme_admin": false, 00:18:49.166 "nvme_io": false, 00:18:49.166 "nvme_io_md": false, 00:18:49.166 "write_zeroes": true, 00:18:49.166 "zcopy": true, 00:18:49.166 "get_zone_info": false, 00:18:49.166 "zone_management": false, 00:18:49.166 "zone_append": false, 00:18:49.166 "compare": false, 00:18:49.166 "compare_and_write": false, 00:18:49.166 "abort": true, 00:18:49.166 "seek_hole": false, 00:18:49.166 "seek_data": false, 00:18:49.166 "copy": true, 00:18:49.166 "nvme_iov_md": false 00:18:49.166 }, 00:18:49.166 "memory_domains": [ 00:18:49.166 { 00:18:49.166 "dma_device_id": "system", 00:18:49.166 "dma_device_type": 1 00:18:49.166 }, 00:18:49.166 { 00:18:49.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.166 "dma_device_type": 2 00:18:49.166 } 00:18:49.166 ], 00:18:49.166 "driver_specific": { 00:18:49.166 "passthru": { 00:18:49.166 "name": "pt2", 00:18:49.166 "base_bdev_name": "malloc2" 00:18:49.166 } 00:18:49.166 } 00:18:49.166 }' 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.425 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.684 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.684 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.684 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:49.684 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:49.684 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:49.943 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:49.943 "name": "pt3", 00:18:49.943 "aliases": [ 00:18:49.943 "00000000-0000-0000-0000-000000000003" 00:18:49.943 ], 00:18:49.943 "product_name": "passthru", 00:18:49.943 "block_size": 512, 00:18:49.943 "num_blocks": 65536, 00:18:49.943 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:49.943 "assigned_rate_limits": { 00:18:49.943 "rw_ios_per_sec": 0, 00:18:49.943 "rw_mbytes_per_sec": 0, 00:18:49.943 "r_mbytes_per_sec": 0, 00:18:49.943 "w_mbytes_per_sec": 0 00:18:49.943 }, 00:18:49.943 "claimed": true, 00:18:49.943 "claim_type": "exclusive_write", 00:18:49.943 "zoned": false, 00:18:49.943 "supported_io_types": { 00:18:49.943 "read": true, 00:18:49.943 "write": true, 00:18:49.943 "unmap": true, 00:18:49.943 "flush": true, 00:18:49.943 "reset": true, 00:18:49.943 "nvme_admin": false, 00:18:49.943 "nvme_io": false, 00:18:49.943 "nvme_io_md": false, 00:18:49.943 "write_zeroes": true, 00:18:49.943 "zcopy": true, 00:18:49.943 "get_zone_info": false, 00:18:49.943 "zone_management": false, 00:18:49.943 "zone_append": false, 00:18:49.943 "compare": false, 00:18:49.943 "compare_and_write": false, 00:18:49.943 "abort": true, 00:18:49.943 "seek_hole": false, 00:18:49.943 "seek_data": false, 00:18:49.943 "copy": true, 00:18:49.943 "nvme_iov_md": false 00:18:49.943 }, 00:18:49.943 "memory_domains": [ 00:18:49.943 { 00:18:49.943 "dma_device_id": "system", 00:18:49.943 "dma_device_type": 1 00:18:49.943 }, 00:18:49.943 { 00:18:49.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.943 "dma_device_type": 2 00:18:49.943 } 00:18:49.943 ], 00:18:49.943 "driver_specific": { 00:18:49.943 "passthru": { 00:18:49.943 "name": "pt3", 00:18:49.943 "base_bdev_name": "malloc3" 00:18:49.943 } 00:18:49.943 } 00:18:49.943 }' 00:18:49.943 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.943 10:45:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.943 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:49.944 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.944 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.944 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:49.944 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.203 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.203 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:50.203 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.203 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.203 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:50.203 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:50.203 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:50.203 
10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:50.462 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:50.462 "name": "pt4", 00:18:50.462 "aliases": [ 00:18:50.462 "00000000-0000-0000-0000-000000000004" 00:18:50.462 ], 00:18:50.462 "product_name": "passthru", 00:18:50.462 "block_size": 512, 00:18:50.462 "num_blocks": 65536, 00:18:50.462 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:50.462 "assigned_rate_limits": { 00:18:50.462 "rw_ios_per_sec": 0, 00:18:50.462 "rw_mbytes_per_sec": 0, 00:18:50.462 "r_mbytes_per_sec": 0, 00:18:50.462 "w_mbytes_per_sec": 0 00:18:50.462 }, 00:18:50.462 "claimed": true, 00:18:50.462 "claim_type": "exclusive_write", 00:18:50.462 "zoned": false, 00:18:50.462 "supported_io_types": { 00:18:50.462 "read": true, 00:18:50.462 "write": true, 00:18:50.462 "unmap": true, 00:18:50.462 "flush": true, 00:18:50.462 "reset": true, 00:18:50.462 "nvme_admin": false, 00:18:50.462 "nvme_io": false, 00:18:50.462 "nvme_io_md": false, 00:18:50.462 "write_zeroes": true, 00:18:50.462 "zcopy": true, 00:18:50.462 "get_zone_info": false, 00:18:50.462 "zone_management": false, 00:18:50.462 "zone_append": false, 00:18:50.462 "compare": false, 00:18:50.462 "compare_and_write": false, 00:18:50.462 "abort": true, 00:18:50.462 "seek_hole": false, 00:18:50.462 "seek_data": false, 00:18:50.462 "copy": true, 00:18:50.462 "nvme_iov_md": false 00:18:50.462 }, 00:18:50.462 "memory_domains": [ 00:18:50.462 { 00:18:50.462 "dma_device_id": "system", 00:18:50.462 "dma_device_type": 1 00:18:50.462 }, 00:18:50.462 { 00:18:50.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.462 "dma_device_type": 2 00:18:50.462 } 00:18:50.462 ], 00:18:50.462 "driver_specific": { 00:18:50.462 "passthru": { 00:18:50.462 "name": "pt4", 00:18:50.462 "base_bdev_name": "malloc4" 00:18:50.462 } 00:18:50.462 } 00:18:50.462 }' 00:18:50.462 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.462 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.462 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:50.462 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:50.462 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:50.721 10:45:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:50.980 [2024-07-12 10:45:26.065509] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:50.980 10:45:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8f61e88e-322c-490c-83b0-387f52179d67 '!=' 8f61e88e-322c-490c-83b0-387f52179d67 ']' 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2085895 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2085895 ']' 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2085895 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2085895 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2085895' 00:18:50.980 killing process with pid 2085895 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2085895 00:18:50.980 [2024-07-12 10:45:26.136862] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:50.980 [2024-07-12 10:45:26.136923] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:50.980 [2024-07-12 10:45:26.136985] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:50.980 [2024-07-12 10:45:26.136997] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eb4780 name raid_bdev1, state offline 00:18:50.980 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2085895 00:18:50.980 [2024-07-12 10:45:26.173623] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:51.239 10:45:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:51.239 00:18:51.239 real 0m16.830s 00:18:51.239 user 0m30.421s 00:18:51.239 sys 0m3.002s 00:18:51.239 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:51.239 10:45:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.239 ************************************ 00:18:51.239 END TEST raid_superblock_test 00:18:51.239 ************************************ 00:18:51.239 10:45:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:51.239 10:45:26 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:18:51.239 10:45:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:51.239 10:45:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:51.239 10:45:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:51.498 ************************************ 00:18:51.498 START TEST raid_read_error_test 00:18:51.498 ************************************ 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- 
# raid_io_error_test raid0 4 read 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RUOsf4Pd2W 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2088339 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2088339 /var/tmp/spdk-raid.sock 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:51.498 10:45:26 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2088339 ']' 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:51.498 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:51.498 10:45:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.498 [2024-07-12 10:45:26.533354] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:18:51.498 [2024-07-12 10:45:26.533421] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2088339 ] 00:18:51.498 [2024-07-12 10:45:26.652342] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:51.757 [2024-07-12 10:45:26.758719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:51.757 [2024-07-12 10:45:26.827201] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:51.757 [2024-07-12 10:45:26.827238] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:52.320 10:45:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:52.320 10:45:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:52.320 10:45:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:52.320 10:45:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:52.577 BaseBdev1_malloc 00:18:52.577 10:45:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:52.844 true 00:18:52.844 10:45:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:53.133 [2024-07-12 10:45:28.102697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:53.134 [2024-07-12 10:45:28.102742] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:53.134 [2024-07-12 10:45:28.102762] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21f80d0 00:18:53.134 [2024-07-12 10:45:28.102775] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:53.134 [2024-07-12 10:45:28.104658] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:53.134 [2024-07-12 10:45:28.104688] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:53.134 BaseBdev1 00:18:53.134 10:45:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # 
for bdev in "${base_bdevs[@]}" 00:18:53.134 10:45:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:53.408 BaseBdev2_malloc 00:18:53.408 10:45:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:53.408 true 00:18:53.665 10:45:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:53.665 [2024-07-12 10:45:28.825142] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:53.665 [2024-07-12 10:45:28.825186] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:53.665 [2024-07-12 10:45:28.825207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21fc910 00:18:53.665 [2024-07-12 10:45:28.825219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:53.665 [2024-07-12 10:45:28.826777] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:53.665 [2024-07-12 10:45:28.826804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:53.665 BaseBdev2 00:18:53.665 10:45:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:53.665 10:45:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:53.921 BaseBdev3_malloc 00:18:53.921 10:45:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:54.179 true 00:18:54.179 10:45:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:54.437 [2024-07-12 10:45:29.568886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:54.437 [2024-07-12 10:45:29.568931] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:54.437 [2024-07-12 10:45:29.568952] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21febd0 00:18:54.437 [2024-07-12 10:45:29.568965] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:54.437 [2024-07-12 10:45:29.570569] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:54.437 [2024-07-12 10:45:29.570596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:54.437 BaseBdev3 00:18:54.437 10:45:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:54.437 10:45:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:54.696 BaseBdev4_malloc 00:18:54.696 10:45:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:54.953 true 00:18:54.953 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:55.211 [2024-07-12 10:45:30.263274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:55.211 [2024-07-12 10:45:30.263316] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:55.211 [2024-07-12 10:45:30.263336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21ffaa0 00:18:55.211 [2024-07-12 10:45:30.263349] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:55.211 [2024-07-12 10:45:30.264856] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:55.211 [2024-07-12 10:45:30.264883] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:55.211 BaseBdev4 00:18:55.211 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:55.469 [2024-07-12 10:45:30.519976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:55.469 [2024-07-12 10:45:30.521356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:55.469 [2024-07-12 10:45:30.521426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:55.469 [2024-07-12 10:45:30.521495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:55.469 [2024-07-12 10:45:30.521729] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21f9c20 00:18:55.469 [2024-07-12 10:45:30.521741] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:55.469 [2024-07-12 10:45:30.521942] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x204e260 00:18:55.469 [2024-07-12 10:45:30.522093] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21f9c20 00:18:55.469 [2024-07-12 10:45:30.522103] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21f9c20 00:18:55.469 [2024-07-12 10:45:30.522210] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
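Sketch of the bdev stack that raid_read_error_test builds in the trace above, reconstructed from the rpc.py calls shown there (this is an illustrative sketch, not additional captured output; it assumes an SPDK target is already listening on /var/tmp/spdk-raid.sock, the full /var/jenkins/... path is shortened to the repository-relative scripts/rpc.py, and the first three calls are repeated for BaseBdev2..BaseBdev4):
# one RAID member: malloc bdev -> error-injection bdev -> passthru bdev
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
# assemble the raid0 volume (64 KiB strip size, -s enables the superblock) and inspect it
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
As the trace shows, bdev_error_create wraps BaseBdev1_malloc and exposes it as EE_BaseBdev1_malloc, and the passthru bdev renames that to BaseBdev1 before it is handed to bdev_raid_create, so errors can later be injected one layer below the names the RAID volume sees.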
00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.469 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.727 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.727 "name": "raid_bdev1", 00:18:55.727 "uuid": "93fe3f6b-9f04-4773-9b25-05c3bc1a9efc", 00:18:55.727 "strip_size_kb": 64, 00:18:55.727 "state": "online", 00:18:55.727 "raid_level": "raid0", 00:18:55.727 "superblock": true, 00:18:55.727 "num_base_bdevs": 4, 00:18:55.727 "num_base_bdevs_discovered": 4, 00:18:55.727 "num_base_bdevs_operational": 4, 00:18:55.727 "base_bdevs_list": [ 00:18:55.727 { 00:18:55.727 "name": "BaseBdev1", 00:18:55.727 "uuid": "6e5ef66c-1f7e-5922-b0f2-757a27b87106", 00:18:55.727 "is_configured": true, 00:18:55.727 "data_offset": 2048, 00:18:55.727 "data_size": 63488 00:18:55.727 }, 00:18:55.727 { 00:18:55.727 "name": "BaseBdev2", 00:18:55.727 "uuid": "7a2addf6-af22-5091-977c-8f58c0cff1bb", 00:18:55.727 "is_configured": true, 00:18:55.727 "data_offset": 2048, 00:18:55.727 "data_size": 63488 00:18:55.727 }, 00:18:55.727 { 00:18:55.727 "name": "BaseBdev3", 00:18:55.727 "uuid": "a2a19227-fd02-5452-86a6-23e7681065a4", 00:18:55.727 "is_configured": true, 00:18:55.727 "data_offset": 2048, 00:18:55.727 "data_size": 63488 00:18:55.727 }, 00:18:55.727 { 00:18:55.727 "name": "BaseBdev4", 00:18:55.727 "uuid": "f587f28b-19d8-5628-8415-14e38faa1c64", 00:18:55.727 "is_configured": true, 00:18:55.727 "data_offset": 2048, 00:18:55.727 "data_size": 63488 00:18:55.727 } 00:18:55.727 ] 00:18:55.727 }' 00:18:55.727 10:45:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.727 10:45:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.293 10:45:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:56.293 10:45:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:56.293 [2024-07-12 10:45:31.458739] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ebfc0 00:18:57.227 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.484 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:57.742 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.742 "name": "raid_bdev1", 00:18:57.742 "uuid": "93fe3f6b-9f04-4773-9b25-05c3bc1a9efc", 00:18:57.742 "strip_size_kb": 64, 00:18:57.742 "state": "online", 00:18:57.742 "raid_level": "raid0", 00:18:57.742 "superblock": true, 00:18:57.742 "num_base_bdevs": 4, 00:18:57.742 "num_base_bdevs_discovered": 4, 00:18:57.742 "num_base_bdevs_operational": 4, 00:18:57.742 "base_bdevs_list": [ 00:18:57.742 { 00:18:57.742 "name": "BaseBdev1", 00:18:57.742 "uuid": "6e5ef66c-1f7e-5922-b0f2-757a27b87106", 00:18:57.742 "is_configured": true, 00:18:57.742 "data_offset": 2048, 00:18:57.742 "data_size": 63488 00:18:57.742 }, 00:18:57.742 { 00:18:57.742 "name": "BaseBdev2", 00:18:57.742 "uuid": "7a2addf6-af22-5091-977c-8f58c0cff1bb", 00:18:57.742 "is_configured": true, 00:18:57.742 "data_offset": 2048, 00:18:57.742 "data_size": 63488 00:18:57.742 }, 00:18:57.742 { 00:18:57.742 "name": "BaseBdev3", 00:18:57.742 "uuid": "a2a19227-fd02-5452-86a6-23e7681065a4", 00:18:57.742 "is_configured": true, 00:18:57.742 "data_offset": 2048, 00:18:57.742 "data_size": 63488 00:18:57.742 }, 00:18:57.742 { 00:18:57.742 "name": "BaseBdev4", 00:18:57.742 "uuid": "f587f28b-19d8-5628-8415-14e38faa1c64", 00:18:57.742 "is_configured": true, 00:18:57.742 "data_offset": 2048, 00:18:57.742 "data_size": 63488 00:18:57.742 } 00:18:57.742 ] 00:18:57.742 }' 00:18:57.742 10:45:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.742 10:45:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.309 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:58.568 [2024-07-12 10:45:33.611696] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:58.568 [2024-07-12 10:45:33.611731] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:58.568 [2024-07-12 10:45:33.614893] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:58.568 [2024-07-12 10:45:33.614931] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:58.568 [2024-07-12 10:45:33.614972] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:58.568 [2024-07-12 10:45:33.614984] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x21f9c20 name raid_bdev1, state offline 00:18:58.568 0 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2088339 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2088339 ']' 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2088339 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2088339 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2088339' 00:18:58.568 killing process with pid 2088339 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2088339 00:18:58.568 [2024-07-12 10:45:33.682100] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:58.568 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2088339 00:18:58.568 [2024-07-12 10:45:33.713905] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RUOsf4Pd2W 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:18:58.827 00:18:58.827 real 0m7.496s 00:18:58.827 user 0m11.929s 00:18:58.827 sys 0m1.323s 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:58.827 10:45:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.827 ************************************ 00:18:58.827 END TEST raid_read_error_test 00:18:58.827 ************************************ 00:18:58.827 10:45:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:58.827 10:45:33 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:18:58.827 10:45:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:58.827 10:45:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:58.827 10:45:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:59.087 ************************************ 00:18:59.087 START TEST raid_write_error_test 00:18:59.087 ************************************ 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 
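The write variant that starts here follows the same flow as the read test that just finished: bdevperf is started in server mode against /var/tmp/spdk-raid.sock, an I/O error is injected into the first member's error bdev, traffic is driven via the bdevperf RPC helper, and the bdevperf log is parsed for a failure rate. Reconstructed from the read test's trace above as a rough sketch (paths shortened; /raidtest/tmp.RUOsf4Pd2W is the mktemp log from that run), the core of the check is:
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
# raid0 has no redundancy, so the injected member failure must surface as bdevperf I/O failures
fail_per_s=$(grep -v Job /raidtest/tmp.RUOsf4Pd2W | grep raid_bdev1 | awk '{print $6}')
[[ "$fail_per_s" != "0.00" ]]
For the write test the injection becomes "EE_BaseBdev1_malloc write failure" and the log path becomes that run's own mktemp result, but the expectation is the same: a non-zero fail-per-second (0.47 in both runs captured here), because has_redundancy returns 1 for raid0.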
00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.7nhHrrZtsw 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2089498 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2089498 /var/tmp/spdk-raid.sock 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:59.087 10:45:34 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2089498 ']' 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:59.087 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:59.087 10:45:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:59.087 [2024-07-12 10:45:34.118367] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:18:59.087 [2024-07-12 10:45:34.118442] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2089498 ] 00:18:59.087 [2024-07-12 10:45:34.251566] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:59.346 [2024-07-12 10:45:34.356100] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:59.346 [2024-07-12 10:45:34.419907] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:59.346 [2024-07-12 10:45:34.419945] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:59.914 10:45:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:59.914 10:45:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:59.914 10:45:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:59.914 10:45:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:00.173 BaseBdev1_malloc 00:19:00.173 10:45:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:00.431 true 00:19:00.431 10:45:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:00.690 [2024-07-12 10:45:35.771290] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:00.690 [2024-07-12 10:45:35.771341] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.690 [2024-07-12 10:45:35.771361] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26540d0 00:19:00.690 [2024-07-12 10:45:35.771374] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.690 [2024-07-12 10:45:35.773176] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.690 [2024-07-12 10:45:35.773207] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:00.690 BaseBdev1 00:19:00.690 10:45:35 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:00.690 10:45:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:00.948 BaseBdev2_malloc 00:19:00.948 10:45:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:01.207 true 00:19:01.207 10:45:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:01.466 [2024-07-12 10:45:36.429752] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:01.466 [2024-07-12 10:45:36.429798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:01.466 [2024-07-12 10:45:36.429818] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2658910 00:19:01.466 [2024-07-12 10:45:36.429831] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:01.466 [2024-07-12 10:45:36.431253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:01.466 [2024-07-12 10:45:36.431281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:01.466 BaseBdev2 00:19:01.466 10:45:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:01.466 10:45:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:01.726 BaseBdev3_malloc 00:19:01.726 10:45:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:01.726 true 00:19:01.985 10:45:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:01.985 [2024-07-12 10:45:37.144175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:01.985 [2024-07-12 10:45:37.144221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:01.985 [2024-07-12 10:45:37.144240] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265abd0 00:19:01.985 [2024-07-12 10:45:37.144253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:01.985 [2024-07-12 10:45:37.145705] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:01.985 [2024-07-12 10:45:37.145733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:01.985 BaseBdev3 00:19:01.985 10:45:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:01.985 10:45:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:02.245 BaseBdev4_malloc 00:19:02.245 10:45:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:02.503 true 00:19:02.503 10:45:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:02.762 [2024-07-12 10:45:37.878807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:02.762 [2024-07-12 10:45:37.878855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:02.762 [2024-07-12 10:45:37.878875] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265baa0 00:19:02.762 [2024-07-12 10:45:37.878888] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:02.762 [2024-07-12 10:45:37.880385] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:02.762 [2024-07-12 10:45:37.880413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:02.762 BaseBdev4 00:19:02.762 10:45:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:03.022 [2024-07-12 10:45:38.127497] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:03.022 [2024-07-12 10:45:38.128686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:03.022 [2024-07-12 10:45:38.128754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:03.022 [2024-07-12 10:45:38.128815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:03.022 [2024-07-12 10:45:38.129038] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2655c20 00:19:03.022 [2024-07-12 10:45:38.129050] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:03.022 [2024-07-12 10:45:38.129237] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24aa260 00:19:03.022 [2024-07-12 10:45:38.129381] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2655c20 00:19:03.022 [2024-07-12 10:45:38.129390] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2655c20 00:19:03.022 [2024-07-12 10:45:38.129503] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.022 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:03.280 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.280 "name": "raid_bdev1", 00:19:03.280 "uuid": "011edfbf-10c6-4539-8ce0-a0cc2ff3b9c8", 00:19:03.280 "strip_size_kb": 64, 00:19:03.281 "state": "online", 00:19:03.281 "raid_level": "raid0", 00:19:03.281 "superblock": true, 00:19:03.281 "num_base_bdevs": 4, 00:19:03.281 "num_base_bdevs_discovered": 4, 00:19:03.281 "num_base_bdevs_operational": 4, 00:19:03.281 "base_bdevs_list": [ 00:19:03.281 { 00:19:03.281 "name": "BaseBdev1", 00:19:03.281 "uuid": "5e5e74a6-213c-53d4-87cc-ac5437c7ae31", 00:19:03.281 "is_configured": true, 00:19:03.281 "data_offset": 2048, 00:19:03.281 "data_size": 63488 00:19:03.281 }, 00:19:03.281 { 00:19:03.281 "name": "BaseBdev2", 00:19:03.281 "uuid": "42c06e6f-b2dd-577b-99d8-46ccc4090d1b", 00:19:03.281 "is_configured": true, 00:19:03.281 "data_offset": 2048, 00:19:03.281 "data_size": 63488 00:19:03.281 }, 00:19:03.281 { 00:19:03.281 "name": "BaseBdev3", 00:19:03.281 "uuid": "78350725-0727-5214-8200-024b3318549e", 00:19:03.281 "is_configured": true, 00:19:03.281 "data_offset": 2048, 00:19:03.281 "data_size": 63488 00:19:03.281 }, 00:19:03.281 { 00:19:03.281 "name": "BaseBdev4", 00:19:03.281 "uuid": "138cb7ec-de6c-565c-abed-a4794dbdce71", 00:19:03.281 "is_configured": true, 00:19:03.281 "data_offset": 2048, 00:19:03.281 "data_size": 63488 00:19:03.281 } 00:19:03.281 ] 00:19:03.281 }' 00:19:03.281 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.281 10:45:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.847 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:03.848 10:45:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:04.106 [2024-07-12 10:45:39.086322] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2647fc0 00:19:05.042 10:45:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.301 10:45:40 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.301 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.301 "name": "raid_bdev1", 00:19:05.301 "uuid": "011edfbf-10c6-4539-8ce0-a0cc2ff3b9c8", 00:19:05.301 "strip_size_kb": 64, 00:19:05.301 "state": "online", 00:19:05.301 "raid_level": "raid0", 00:19:05.301 "superblock": true, 00:19:05.301 "num_base_bdevs": 4, 00:19:05.301 "num_base_bdevs_discovered": 4, 00:19:05.301 "num_base_bdevs_operational": 4, 00:19:05.301 "base_bdevs_list": [ 00:19:05.301 { 00:19:05.301 "name": "BaseBdev1", 00:19:05.301 "uuid": "5e5e74a6-213c-53d4-87cc-ac5437c7ae31", 00:19:05.301 "is_configured": true, 00:19:05.301 "data_offset": 2048, 00:19:05.301 "data_size": 63488 00:19:05.301 }, 00:19:05.301 { 00:19:05.301 "name": "BaseBdev2", 00:19:05.301 "uuid": "42c06e6f-b2dd-577b-99d8-46ccc4090d1b", 00:19:05.301 "is_configured": true, 00:19:05.302 "data_offset": 2048, 00:19:05.302 "data_size": 63488 00:19:05.302 }, 00:19:05.302 { 00:19:05.302 "name": "BaseBdev3", 00:19:05.302 "uuid": "78350725-0727-5214-8200-024b3318549e", 00:19:05.302 "is_configured": true, 00:19:05.302 "data_offset": 2048, 00:19:05.302 "data_size": 63488 00:19:05.302 }, 00:19:05.302 { 00:19:05.302 "name": "BaseBdev4", 00:19:05.302 "uuid": "138cb7ec-de6c-565c-abed-a4794dbdce71", 00:19:05.302 "is_configured": true, 00:19:05.302 "data_offset": 2048, 00:19:05.302 "data_size": 63488 00:19:05.302 } 00:19:05.302 ] 00:19:05.302 }' 00:19:05.302 10:45:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.302 10:45:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:06.239 [2024-07-12 10:45:41.235201] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:06.239 [2024-07-12 10:45:41.235249] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:06.239 [2024-07-12 10:45:41.238417] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:06.239 [2024-07-12 10:45:41.238456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:06.239 [2024-07-12 10:45:41.238502] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:06.239 [2024-07-12 
10:45:41.238514] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2655c20 name raid_bdev1, state offline 00:19:06.239 0 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2089498 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2089498 ']' 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2089498 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2089498 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2089498' 00:19:06.239 killing process with pid 2089498 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2089498 00:19:06.239 [2024-07-12 10:45:41.301134] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:06.239 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2089498 00:19:06.239 [2024-07-12 10:45:41.332658] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.7nhHrrZtsw 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:19:06.498 00:19:06.498 real 0m7.535s 00:19:06.498 user 0m12.009s 00:19:06.498 sys 0m1.354s 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:06.498 10:45:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.498 ************************************ 00:19:06.498 END TEST raid_write_error_test 00:19:06.498 ************************************ 00:19:06.498 10:45:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:06.498 10:45:41 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:06.498 10:45:41 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:19:06.498 10:45:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:06.498 10:45:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:06.498 10:45:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:06.498 ************************************ 00:19:06.498 START TEST raid_state_function_test 
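The raid_write_error_test trace that finishes above reduces to a short chain of RPC calls against the test socket, all of which appear verbatim in the log: each base device is a malloc bdev wrapped first in an error bdev (which gains the EE_ prefix) and then in a passthru bdev, the four passthru bdevs are assembled into a raid0 volume with a 64 KiB strip size and a superblock, a write failure is injected on one leg while bdevperf drives I/O, and the array is torn down once the state checks pass. A condensed sketch of that sequence, with the long Jenkins workspace paths shortened; the malloc creation for this test happened earlier in the log, so the sizing shown is borrowed from the state-function test below and is illustrative only:

  RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # one error-injectable leg; repeated for BaseBdev1..BaseBdev4
  $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc             # size/block count illustrative
  $RPC bdev_error_create BaseBdev1_malloc                        # exposes EE_BaseBdev1_malloc
  $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # assemble the striped volume with a superblock (-s) and 64 KiB strips (-z 64)
  $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
  # drive I/O, then fail all writes on one leg while the job runs
  examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
  $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
  # tear down after the state checks
  $RPC bdev_raid_delete raid_bdev1

Because has_redundancy rejects raid0, the test only asserts that the failure rate bdevperf reports (0.47 failures per second here, parsed out of /raidtest/tmp.7nhHrrZtsw with grep and awk) is not 0.00; with no redundancy, the injected write errors are expected to surface rather than be absorbed.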
00:19:06.498 ************************************ 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:06.498 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2090615 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 
2090615' 00:19:06.499 Process raid pid: 2090615 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2090615 /var/tmp/spdk-raid.sock 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2090615 ']' 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:06.499 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:06.499 10:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.757 [2024-07-12 10:45:41.722734] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:19:06.757 [2024-07-12 10:45:41.722796] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:06.757 [2024-07-12 10:45:41.852844] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:07.015 [2024-07-12 10:45:41.959284] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:07.015 [2024-07-12 10:45:42.031663] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:07.015 [2024-07-12 10:45:42.031693] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:07.583 10:45:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:07.583 10:45:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:07.583 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:07.842 [2024-07-12 10:45:42.879833] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:07.842 [2024-07-12 10:45:42.879875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:07.842 [2024-07-12 10:45:42.879886] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:07.842 [2024-07-12 10:45:42.879898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:07.842 [2024-07-12 10:45:42.879906] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:07.842 [2024-07-12 10:45:42.879917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:07.842 [2024-07-12 10:45:42.879926] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:07.842 [2024-07-12 10:45:42.879937] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:07.842 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:07.842 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.842 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.842 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:07.842 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.843 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.843 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.843 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.843 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.843 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.843 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.843 10:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:08.101 10:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.101 "name": "Existed_Raid", 00:19:08.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.101 "strip_size_kb": 64, 00:19:08.101 "state": "configuring", 00:19:08.101 "raid_level": "concat", 00:19:08.101 "superblock": false, 00:19:08.101 "num_base_bdevs": 4, 00:19:08.101 "num_base_bdevs_discovered": 0, 00:19:08.101 "num_base_bdevs_operational": 4, 00:19:08.101 "base_bdevs_list": [ 00:19:08.101 { 00:19:08.101 "name": "BaseBdev1", 00:19:08.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.101 "is_configured": false, 00:19:08.101 "data_offset": 0, 00:19:08.101 "data_size": 0 00:19:08.101 }, 00:19:08.101 { 00:19:08.101 "name": "BaseBdev2", 00:19:08.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.101 "is_configured": false, 00:19:08.101 "data_offset": 0, 00:19:08.101 "data_size": 0 00:19:08.101 }, 00:19:08.101 { 00:19:08.101 "name": "BaseBdev3", 00:19:08.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.101 "is_configured": false, 00:19:08.101 "data_offset": 0, 00:19:08.101 "data_size": 0 00:19:08.101 }, 00:19:08.101 { 00:19:08.101 "name": "BaseBdev4", 00:19:08.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.101 "is_configured": false, 00:19:08.101 "data_offset": 0, 00:19:08.101 "data_size": 0 00:19:08.101 } 00:19:08.101 ] 00:19:08.101 }' 00:19:08.101 10:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.101 10:45:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.667 10:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:08.926 [2024-07-12 10:45:43.962622] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:08.926 
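A detail worth noting in the state-function trace above: bdev_raid_create succeeds even though none of the four named base bdevs exist yet (each is logged as "doesn't exist now"), and the resulting Existed_Raid simply sits in the "configuring" state with num_base_bdevs_discovered at 0 and num_base_bdevs_operational at 4. Deleting it in that state goes straight through raid_bdev_cleanup rather than the online-to-offline transition seen for raid_bdev1 earlier. Reduced to the RPC calls traced here, with paths shortened:

  RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  # -> "state": "configuring", "num_base_bdevs_discovered": 0, "num_base_bdevs_operational": 4
  $RPC bdev_raid_delete Existed_Raid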
[2024-07-12 10:45:43.962653] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1761aa0 name Existed_Raid, state configuring 00:19:08.926 10:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:09.185 [2024-07-12 10:45:44.207291] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:09.185 [2024-07-12 10:45:44.207326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:09.185 [2024-07-12 10:45:44.207336] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:09.185 [2024-07-12 10:45:44.207347] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:09.185 [2024-07-12 10:45:44.207356] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:09.185 [2024-07-12 10:45:44.207367] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:09.185 [2024-07-12 10:45:44.207376] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:09.185 [2024-07-12 10:45:44.207386] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:09.185 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:09.444 [2024-07-12 10:45:44.461875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:09.444 BaseBdev1 00:19:09.444 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:09.444 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:09.444 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:09.444 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:09.444 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:09.444 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:09.444 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:09.752 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:10.011 [ 00:19:10.011 { 00:19:10.011 "name": "BaseBdev1", 00:19:10.011 "aliases": [ 00:19:10.011 "35e41842-9c62-4798-acbb-e52e90f7018e" 00:19:10.011 ], 00:19:10.011 "product_name": "Malloc disk", 00:19:10.011 "block_size": 512, 00:19:10.011 "num_blocks": 65536, 00:19:10.011 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:10.011 "assigned_rate_limits": { 00:19:10.011 "rw_ios_per_sec": 0, 00:19:10.011 "rw_mbytes_per_sec": 0, 00:19:10.011 "r_mbytes_per_sec": 0, 00:19:10.011 "w_mbytes_per_sec": 0 00:19:10.011 }, 00:19:10.011 "claimed": true, 00:19:10.011 "claim_type": "exclusive_write", 00:19:10.011 "zoned": false, 
00:19:10.011 "supported_io_types": { 00:19:10.011 "read": true, 00:19:10.011 "write": true, 00:19:10.011 "unmap": true, 00:19:10.011 "flush": true, 00:19:10.011 "reset": true, 00:19:10.011 "nvme_admin": false, 00:19:10.011 "nvme_io": false, 00:19:10.011 "nvme_io_md": false, 00:19:10.011 "write_zeroes": true, 00:19:10.011 "zcopy": true, 00:19:10.011 "get_zone_info": false, 00:19:10.011 "zone_management": false, 00:19:10.011 "zone_append": false, 00:19:10.011 "compare": false, 00:19:10.011 "compare_and_write": false, 00:19:10.011 "abort": true, 00:19:10.011 "seek_hole": false, 00:19:10.011 "seek_data": false, 00:19:10.011 "copy": true, 00:19:10.011 "nvme_iov_md": false 00:19:10.011 }, 00:19:10.011 "memory_domains": [ 00:19:10.011 { 00:19:10.011 "dma_device_id": "system", 00:19:10.011 "dma_device_type": 1 00:19:10.011 }, 00:19:10.011 { 00:19:10.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.011 "dma_device_type": 2 00:19:10.011 } 00:19:10.011 ], 00:19:10.011 "driver_specific": {} 00:19:10.011 } 00:19:10.011 ] 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.011 10:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:10.271 10:45:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.271 "name": "Existed_Raid", 00:19:10.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.271 "strip_size_kb": 64, 00:19:10.271 "state": "configuring", 00:19:10.271 "raid_level": "concat", 00:19:10.271 "superblock": false, 00:19:10.271 "num_base_bdevs": 4, 00:19:10.271 "num_base_bdevs_discovered": 1, 00:19:10.271 "num_base_bdevs_operational": 4, 00:19:10.271 "base_bdevs_list": [ 00:19:10.271 { 00:19:10.271 "name": "BaseBdev1", 00:19:10.271 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:10.271 "is_configured": true, 00:19:10.271 "data_offset": 0, 00:19:10.271 "data_size": 65536 00:19:10.271 }, 00:19:10.271 { 00:19:10.271 "name": "BaseBdev2", 00:19:10.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.271 "is_configured": false, 00:19:10.271 "data_offset": 0, 00:19:10.271 
"data_size": 0 00:19:10.271 }, 00:19:10.271 { 00:19:10.271 "name": "BaseBdev3", 00:19:10.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.271 "is_configured": false, 00:19:10.271 "data_offset": 0, 00:19:10.271 "data_size": 0 00:19:10.271 }, 00:19:10.271 { 00:19:10.271 "name": "BaseBdev4", 00:19:10.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.271 "is_configured": false, 00:19:10.271 "data_offset": 0, 00:19:10.271 "data_size": 0 00:19:10.271 } 00:19:10.271 ] 00:19:10.271 }' 00:19:10.271 10:45:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.271 10:45:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.838 10:45:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:10.838 [2024-07-12 10:45:46.030034] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:10.838 [2024-07-12 10:45:46.030074] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1761310 name Existed_Raid, state configuring 00:19:11.096 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:11.096 [2024-07-12 10:45:46.274719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:11.096 [2024-07-12 10:45:46.276161] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:11.096 [2024-07-12 10:45:46.276194] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:11.096 [2024-07-12 10:45:46.276204] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:11.096 [2024-07-12 10:45:46.276216] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:11.096 [2024-07-12 10:45:46.276225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:11.096 [2024-07-12 10:45:46.276236] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.355 10:45:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.355 "name": "Existed_Raid", 00:19:11.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.355 "strip_size_kb": 64, 00:19:11.355 "state": "configuring", 00:19:11.355 "raid_level": "concat", 00:19:11.355 "superblock": false, 00:19:11.355 "num_base_bdevs": 4, 00:19:11.355 "num_base_bdevs_discovered": 1, 00:19:11.355 "num_base_bdevs_operational": 4, 00:19:11.355 "base_bdevs_list": [ 00:19:11.355 { 00:19:11.355 "name": "BaseBdev1", 00:19:11.355 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:11.355 "is_configured": true, 00:19:11.355 "data_offset": 0, 00:19:11.355 "data_size": 65536 00:19:11.355 }, 00:19:11.355 { 00:19:11.355 "name": "BaseBdev2", 00:19:11.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.355 "is_configured": false, 00:19:11.355 "data_offset": 0, 00:19:11.355 "data_size": 0 00:19:11.355 }, 00:19:11.355 { 00:19:11.355 "name": "BaseBdev3", 00:19:11.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.355 "is_configured": false, 00:19:11.355 "data_offset": 0, 00:19:11.355 "data_size": 0 00:19:11.355 }, 00:19:11.355 { 00:19:11.355 "name": "BaseBdev4", 00:19:11.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.355 "is_configured": false, 00:19:11.355 "data_offset": 0, 00:19:11.355 "data_size": 0 00:19:11.355 } 00:19:11.355 ] 00:19:11.355 }' 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.355 10:45:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:12.287 [2024-07-12 10:45:47.429181] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:12.287 BaseBdev2 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:12.287 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:12.545 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:12.802 [ 00:19:12.802 { 00:19:12.802 "name": "BaseBdev2", 00:19:12.802 "aliases": [ 00:19:12.802 "908c9e3c-8c6b-4f98-ba12-a15ade966e26" 00:19:12.802 ], 00:19:12.802 "product_name": "Malloc disk", 00:19:12.802 "block_size": 512, 00:19:12.802 "num_blocks": 65536, 00:19:12.802 "uuid": "908c9e3c-8c6b-4f98-ba12-a15ade966e26", 00:19:12.802 "assigned_rate_limits": { 00:19:12.802 "rw_ios_per_sec": 0, 00:19:12.802 "rw_mbytes_per_sec": 0, 00:19:12.802 "r_mbytes_per_sec": 0, 00:19:12.802 "w_mbytes_per_sec": 0 00:19:12.802 }, 00:19:12.802 "claimed": true, 00:19:12.802 "claim_type": "exclusive_write", 00:19:12.802 "zoned": false, 00:19:12.802 "supported_io_types": { 00:19:12.802 "read": true, 00:19:12.802 "write": true, 00:19:12.802 "unmap": true, 00:19:12.802 "flush": true, 00:19:12.802 "reset": true, 00:19:12.802 "nvme_admin": false, 00:19:12.802 "nvme_io": false, 00:19:12.802 "nvme_io_md": false, 00:19:12.802 "write_zeroes": true, 00:19:12.802 "zcopy": true, 00:19:12.802 "get_zone_info": false, 00:19:12.802 "zone_management": false, 00:19:12.802 "zone_append": false, 00:19:12.802 "compare": false, 00:19:12.802 "compare_and_write": false, 00:19:12.802 "abort": true, 00:19:12.803 "seek_hole": false, 00:19:12.803 "seek_data": false, 00:19:12.803 "copy": true, 00:19:12.803 "nvme_iov_md": false 00:19:12.803 }, 00:19:12.803 "memory_domains": [ 00:19:12.803 { 00:19:12.803 "dma_device_id": "system", 00:19:12.803 "dma_device_type": 1 00:19:12.803 }, 00:19:12.803 { 00:19:12.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.803 "dma_device_type": 2 00:19:12.803 } 00:19:12.803 ], 00:19:12.803 "driver_specific": {} 00:19:12.803 } 00:19:12.803 ] 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:12.803 10:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.060 10:45:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:13.060 "name": "Existed_Raid", 00:19:13.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.060 "strip_size_kb": 64, 00:19:13.060 "state": "configuring", 00:19:13.060 "raid_level": "concat", 00:19:13.060 "superblock": false, 00:19:13.060 "num_base_bdevs": 4, 00:19:13.060 "num_base_bdevs_discovered": 2, 00:19:13.060 "num_base_bdevs_operational": 4, 00:19:13.060 "base_bdevs_list": [ 00:19:13.060 { 00:19:13.060 "name": "BaseBdev1", 00:19:13.060 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:13.061 "is_configured": true, 00:19:13.061 "data_offset": 0, 00:19:13.061 "data_size": 65536 00:19:13.061 }, 00:19:13.061 { 00:19:13.061 "name": "BaseBdev2", 00:19:13.061 "uuid": "908c9e3c-8c6b-4f98-ba12-a15ade966e26", 00:19:13.061 "is_configured": true, 00:19:13.061 "data_offset": 0, 00:19:13.061 "data_size": 65536 00:19:13.061 }, 00:19:13.061 { 00:19:13.061 "name": "BaseBdev3", 00:19:13.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.061 "is_configured": false, 00:19:13.061 "data_offset": 0, 00:19:13.061 "data_size": 0 00:19:13.061 }, 00:19:13.061 { 00:19:13.061 "name": "BaseBdev4", 00:19:13.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.061 "is_configured": false, 00:19:13.061 "data_offset": 0, 00:19:13.061 "data_size": 0 00:19:13.061 } 00:19:13.061 ] 00:19:13.061 }' 00:19:13.061 10:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:13.061 10:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:13.625 10:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:13.883 [2024-07-12 10:45:48.996714] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:13.883 BaseBdev3 00:19:13.883 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:13.883 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:13.883 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:13.883 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:13.883 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:13.883 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:13.883 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:14.141 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:14.397 [ 00:19:14.398 { 00:19:14.398 "name": "BaseBdev3", 00:19:14.398 "aliases": [ 00:19:14.398 "cbfc6348-d740-419b-9d4c-f7a76cfa360b" 00:19:14.398 ], 00:19:14.398 "product_name": "Malloc disk", 00:19:14.398 "block_size": 512, 00:19:14.398 "num_blocks": 65536, 00:19:14.398 "uuid": "cbfc6348-d740-419b-9d4c-f7a76cfa360b", 00:19:14.398 "assigned_rate_limits": { 00:19:14.398 "rw_ios_per_sec": 0, 00:19:14.398 "rw_mbytes_per_sec": 0, 00:19:14.398 "r_mbytes_per_sec": 0, 
00:19:14.398 "w_mbytes_per_sec": 0 00:19:14.398 }, 00:19:14.398 "claimed": true, 00:19:14.398 "claim_type": "exclusive_write", 00:19:14.398 "zoned": false, 00:19:14.398 "supported_io_types": { 00:19:14.398 "read": true, 00:19:14.398 "write": true, 00:19:14.398 "unmap": true, 00:19:14.398 "flush": true, 00:19:14.398 "reset": true, 00:19:14.398 "nvme_admin": false, 00:19:14.398 "nvme_io": false, 00:19:14.398 "nvme_io_md": false, 00:19:14.398 "write_zeroes": true, 00:19:14.398 "zcopy": true, 00:19:14.398 "get_zone_info": false, 00:19:14.398 "zone_management": false, 00:19:14.398 "zone_append": false, 00:19:14.398 "compare": false, 00:19:14.398 "compare_and_write": false, 00:19:14.398 "abort": true, 00:19:14.398 "seek_hole": false, 00:19:14.398 "seek_data": false, 00:19:14.398 "copy": true, 00:19:14.398 "nvme_iov_md": false 00:19:14.398 }, 00:19:14.398 "memory_domains": [ 00:19:14.398 { 00:19:14.398 "dma_device_id": "system", 00:19:14.398 "dma_device_type": 1 00:19:14.398 }, 00:19:14.398 { 00:19:14.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:14.398 "dma_device_type": 2 00:19:14.398 } 00:19:14.398 ], 00:19:14.398 "driver_specific": {} 00:19:14.398 } 00:19:14.398 ] 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.398 "name": "Existed_Raid", 00:19:14.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.398 "strip_size_kb": 64, 00:19:14.398 "state": "configuring", 00:19:14.398 "raid_level": "concat", 00:19:14.398 "superblock": false, 00:19:14.398 "num_base_bdevs": 4, 00:19:14.398 "num_base_bdevs_discovered": 3, 00:19:14.398 "num_base_bdevs_operational": 4, 00:19:14.398 "base_bdevs_list": [ 00:19:14.398 { 00:19:14.398 "name": "BaseBdev1", 
00:19:14.398 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:14.398 "is_configured": true, 00:19:14.398 "data_offset": 0, 00:19:14.398 "data_size": 65536 00:19:14.398 }, 00:19:14.398 { 00:19:14.398 "name": "BaseBdev2", 00:19:14.398 "uuid": "908c9e3c-8c6b-4f98-ba12-a15ade966e26", 00:19:14.398 "is_configured": true, 00:19:14.398 "data_offset": 0, 00:19:14.398 "data_size": 65536 00:19:14.398 }, 00:19:14.398 { 00:19:14.398 "name": "BaseBdev3", 00:19:14.398 "uuid": "cbfc6348-d740-419b-9d4c-f7a76cfa360b", 00:19:14.398 "is_configured": true, 00:19:14.398 "data_offset": 0, 00:19:14.398 "data_size": 65536 00:19:14.398 }, 00:19:14.398 { 00:19:14.398 "name": "BaseBdev4", 00:19:14.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.398 "is_configured": false, 00:19:14.398 "data_offset": 0, 00:19:14.398 "data_size": 0 00:19:14.398 } 00:19:14.398 ] 00:19:14.398 }' 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.398 10:45:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.961 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:15.218 [2024-07-12 10:45:50.311712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:15.218 [2024-07-12 10:45:50.311751] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1762350 00:19:15.218 [2024-07-12 10:45:50.311760] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:15.218 [2024-07-12 10:45:50.312016] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1762020 00:19:15.218 [2024-07-12 10:45:50.312137] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1762350 00:19:15.218 [2024-07-12 10:45:50.312146] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1762350 00:19:15.218 [2024-07-12 10:45:50.312304] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:15.218 BaseBdev4 00:19:15.218 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:15.218 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:15.218 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:15.218 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:15.218 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:15.218 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:15.218 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.475 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:15.475 [ 00:19:15.475 { 00:19:15.475 "name": "BaseBdev4", 00:19:15.475 "aliases": [ 00:19:15.475 "a2393e9c-6442-4e0e-b78c-05a85390fc45" 00:19:15.475 ], 00:19:15.475 "product_name": "Malloc disk", 00:19:15.475 "block_size": 512, 00:19:15.475 
"num_blocks": 65536, 00:19:15.475 "uuid": "a2393e9c-6442-4e0e-b78c-05a85390fc45", 00:19:15.475 "assigned_rate_limits": { 00:19:15.475 "rw_ios_per_sec": 0, 00:19:15.475 "rw_mbytes_per_sec": 0, 00:19:15.475 "r_mbytes_per_sec": 0, 00:19:15.475 "w_mbytes_per_sec": 0 00:19:15.475 }, 00:19:15.475 "claimed": true, 00:19:15.475 "claim_type": "exclusive_write", 00:19:15.475 "zoned": false, 00:19:15.475 "supported_io_types": { 00:19:15.475 "read": true, 00:19:15.475 "write": true, 00:19:15.475 "unmap": true, 00:19:15.475 "flush": true, 00:19:15.475 "reset": true, 00:19:15.475 "nvme_admin": false, 00:19:15.475 "nvme_io": false, 00:19:15.475 "nvme_io_md": false, 00:19:15.475 "write_zeroes": true, 00:19:15.475 "zcopy": true, 00:19:15.475 "get_zone_info": false, 00:19:15.475 "zone_management": false, 00:19:15.475 "zone_append": false, 00:19:15.475 "compare": false, 00:19:15.475 "compare_and_write": false, 00:19:15.475 "abort": true, 00:19:15.475 "seek_hole": false, 00:19:15.475 "seek_data": false, 00:19:15.475 "copy": true, 00:19:15.475 "nvme_iov_md": false 00:19:15.475 }, 00:19:15.475 "memory_domains": [ 00:19:15.475 { 00:19:15.475 "dma_device_id": "system", 00:19:15.475 "dma_device_type": 1 00:19:15.475 }, 00:19:15.475 { 00:19:15.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.475 "dma_device_type": 2 00:19:15.475 } 00:19:15.475 ], 00:19:15.475 "driver_specific": {} 00:19:15.475 } 00:19:15.475 ] 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.733 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.992 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.992 "name": "Existed_Raid", 00:19:15.992 "uuid": "a30e2e72-2749-47ed-b8ef-87d95ac481bb", 00:19:15.992 "strip_size_kb": 64, 00:19:15.992 "state": "online", 00:19:15.992 "raid_level": "concat", 00:19:15.992 "superblock": false, 
00:19:15.992 "num_base_bdevs": 4, 00:19:15.992 "num_base_bdevs_discovered": 4, 00:19:15.992 "num_base_bdevs_operational": 4, 00:19:15.992 "base_bdevs_list": [ 00:19:15.992 { 00:19:15.992 "name": "BaseBdev1", 00:19:15.992 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:15.992 "is_configured": true, 00:19:15.992 "data_offset": 0, 00:19:15.992 "data_size": 65536 00:19:15.992 }, 00:19:15.992 { 00:19:15.992 "name": "BaseBdev2", 00:19:15.992 "uuid": "908c9e3c-8c6b-4f98-ba12-a15ade966e26", 00:19:15.992 "is_configured": true, 00:19:15.992 "data_offset": 0, 00:19:15.992 "data_size": 65536 00:19:15.992 }, 00:19:15.992 { 00:19:15.992 "name": "BaseBdev3", 00:19:15.992 "uuid": "cbfc6348-d740-419b-9d4c-f7a76cfa360b", 00:19:15.992 "is_configured": true, 00:19:15.992 "data_offset": 0, 00:19:15.992 "data_size": 65536 00:19:15.992 }, 00:19:15.992 { 00:19:15.992 "name": "BaseBdev4", 00:19:15.992 "uuid": "a2393e9c-6442-4e0e-b78c-05a85390fc45", 00:19:15.992 "is_configured": true, 00:19:15.992 "data_offset": 0, 00:19:15.992 "data_size": 65536 00:19:15.992 } 00:19:15.992 ] 00:19:15.992 }' 00:19:15.992 10:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.992 10:45:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:16.561 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:16.561 [2024-07-12 10:45:51.751859] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:16.820 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:16.820 "name": "Existed_Raid", 00:19:16.820 "aliases": [ 00:19:16.820 "a30e2e72-2749-47ed-b8ef-87d95ac481bb" 00:19:16.820 ], 00:19:16.820 "product_name": "Raid Volume", 00:19:16.820 "block_size": 512, 00:19:16.820 "num_blocks": 262144, 00:19:16.820 "uuid": "a30e2e72-2749-47ed-b8ef-87d95ac481bb", 00:19:16.820 "assigned_rate_limits": { 00:19:16.820 "rw_ios_per_sec": 0, 00:19:16.820 "rw_mbytes_per_sec": 0, 00:19:16.820 "r_mbytes_per_sec": 0, 00:19:16.820 "w_mbytes_per_sec": 0 00:19:16.820 }, 00:19:16.820 "claimed": false, 00:19:16.820 "zoned": false, 00:19:16.820 "supported_io_types": { 00:19:16.820 "read": true, 00:19:16.820 "write": true, 00:19:16.820 "unmap": true, 00:19:16.820 "flush": true, 00:19:16.820 "reset": true, 00:19:16.820 "nvme_admin": false, 00:19:16.820 "nvme_io": false, 00:19:16.820 "nvme_io_md": false, 00:19:16.820 "write_zeroes": true, 00:19:16.820 "zcopy": false, 00:19:16.820 "get_zone_info": false, 00:19:16.820 "zone_management": false, 00:19:16.820 "zone_append": false, 00:19:16.820 "compare": false, 00:19:16.820 
"compare_and_write": false, 00:19:16.820 "abort": false, 00:19:16.820 "seek_hole": false, 00:19:16.820 "seek_data": false, 00:19:16.820 "copy": false, 00:19:16.820 "nvme_iov_md": false 00:19:16.820 }, 00:19:16.820 "memory_domains": [ 00:19:16.820 { 00:19:16.820 "dma_device_id": "system", 00:19:16.820 "dma_device_type": 1 00:19:16.820 }, 00:19:16.820 { 00:19:16.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.820 "dma_device_type": 2 00:19:16.820 }, 00:19:16.820 { 00:19:16.820 "dma_device_id": "system", 00:19:16.820 "dma_device_type": 1 00:19:16.820 }, 00:19:16.820 { 00:19:16.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.820 "dma_device_type": 2 00:19:16.820 }, 00:19:16.820 { 00:19:16.820 "dma_device_id": "system", 00:19:16.820 "dma_device_type": 1 00:19:16.820 }, 00:19:16.820 { 00:19:16.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.820 "dma_device_type": 2 00:19:16.820 }, 00:19:16.820 { 00:19:16.820 "dma_device_id": "system", 00:19:16.820 "dma_device_type": 1 00:19:16.820 }, 00:19:16.820 { 00:19:16.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.820 "dma_device_type": 2 00:19:16.820 } 00:19:16.821 ], 00:19:16.821 "driver_specific": { 00:19:16.821 "raid": { 00:19:16.821 "uuid": "a30e2e72-2749-47ed-b8ef-87d95ac481bb", 00:19:16.821 "strip_size_kb": 64, 00:19:16.821 "state": "online", 00:19:16.821 "raid_level": "concat", 00:19:16.821 "superblock": false, 00:19:16.821 "num_base_bdevs": 4, 00:19:16.821 "num_base_bdevs_discovered": 4, 00:19:16.821 "num_base_bdevs_operational": 4, 00:19:16.821 "base_bdevs_list": [ 00:19:16.821 { 00:19:16.821 "name": "BaseBdev1", 00:19:16.821 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:16.821 "is_configured": true, 00:19:16.821 "data_offset": 0, 00:19:16.821 "data_size": 65536 00:19:16.821 }, 00:19:16.821 { 00:19:16.821 "name": "BaseBdev2", 00:19:16.821 "uuid": "908c9e3c-8c6b-4f98-ba12-a15ade966e26", 00:19:16.821 "is_configured": true, 00:19:16.821 "data_offset": 0, 00:19:16.821 "data_size": 65536 00:19:16.821 }, 00:19:16.821 { 00:19:16.821 "name": "BaseBdev3", 00:19:16.821 "uuid": "cbfc6348-d740-419b-9d4c-f7a76cfa360b", 00:19:16.821 "is_configured": true, 00:19:16.821 "data_offset": 0, 00:19:16.821 "data_size": 65536 00:19:16.821 }, 00:19:16.821 { 00:19:16.821 "name": "BaseBdev4", 00:19:16.821 "uuid": "a2393e9c-6442-4e0e-b78c-05a85390fc45", 00:19:16.821 "is_configured": true, 00:19:16.821 "data_offset": 0, 00:19:16.821 "data_size": 65536 00:19:16.821 } 00:19:16.821 ] 00:19:16.821 } 00:19:16.821 } 00:19:16.821 }' 00:19:16.821 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:16.821 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:16.821 BaseBdev2 00:19:16.821 BaseBdev3 00:19:16.821 BaseBdev4' 00:19:16.821 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:16.821 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:16.821 10:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:17.080 "name": "BaseBdev1", 00:19:17.080 "aliases": [ 00:19:17.080 "35e41842-9c62-4798-acbb-e52e90f7018e" 00:19:17.080 ], 00:19:17.080 
"product_name": "Malloc disk", 00:19:17.080 "block_size": 512, 00:19:17.080 "num_blocks": 65536, 00:19:17.080 "uuid": "35e41842-9c62-4798-acbb-e52e90f7018e", 00:19:17.080 "assigned_rate_limits": { 00:19:17.080 "rw_ios_per_sec": 0, 00:19:17.080 "rw_mbytes_per_sec": 0, 00:19:17.080 "r_mbytes_per_sec": 0, 00:19:17.080 "w_mbytes_per_sec": 0 00:19:17.080 }, 00:19:17.080 "claimed": true, 00:19:17.080 "claim_type": "exclusive_write", 00:19:17.080 "zoned": false, 00:19:17.080 "supported_io_types": { 00:19:17.080 "read": true, 00:19:17.080 "write": true, 00:19:17.080 "unmap": true, 00:19:17.080 "flush": true, 00:19:17.080 "reset": true, 00:19:17.080 "nvme_admin": false, 00:19:17.080 "nvme_io": false, 00:19:17.080 "nvme_io_md": false, 00:19:17.080 "write_zeroes": true, 00:19:17.080 "zcopy": true, 00:19:17.080 "get_zone_info": false, 00:19:17.080 "zone_management": false, 00:19:17.080 "zone_append": false, 00:19:17.080 "compare": false, 00:19:17.080 "compare_and_write": false, 00:19:17.080 "abort": true, 00:19:17.080 "seek_hole": false, 00:19:17.080 "seek_data": false, 00:19:17.080 "copy": true, 00:19:17.080 "nvme_iov_md": false 00:19:17.080 }, 00:19:17.080 "memory_domains": [ 00:19:17.080 { 00:19:17.080 "dma_device_id": "system", 00:19:17.080 "dma_device_type": 1 00:19:17.080 }, 00:19:17.080 { 00:19:17.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:17.080 "dma_device_type": 2 00:19:17.080 } 00:19:17.080 ], 00:19:17.080 "driver_specific": {} 00:19:17.080 }' 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:17.080 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:17.339 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:17.597 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:17.597 "name": "BaseBdev2", 00:19:17.597 "aliases": [ 00:19:17.597 "908c9e3c-8c6b-4f98-ba12-a15ade966e26" 00:19:17.597 ], 00:19:17.597 "product_name": "Malloc disk", 00:19:17.597 "block_size": 512, 00:19:17.597 "num_blocks": 65536, 00:19:17.597 "uuid": "908c9e3c-8c6b-4f98-ba12-a15ade966e26", 00:19:17.597 
"assigned_rate_limits": { 00:19:17.597 "rw_ios_per_sec": 0, 00:19:17.597 "rw_mbytes_per_sec": 0, 00:19:17.597 "r_mbytes_per_sec": 0, 00:19:17.597 "w_mbytes_per_sec": 0 00:19:17.597 }, 00:19:17.597 "claimed": true, 00:19:17.597 "claim_type": "exclusive_write", 00:19:17.597 "zoned": false, 00:19:17.597 "supported_io_types": { 00:19:17.597 "read": true, 00:19:17.597 "write": true, 00:19:17.597 "unmap": true, 00:19:17.597 "flush": true, 00:19:17.597 "reset": true, 00:19:17.597 "nvme_admin": false, 00:19:17.597 "nvme_io": false, 00:19:17.597 "nvme_io_md": false, 00:19:17.597 "write_zeroes": true, 00:19:17.597 "zcopy": true, 00:19:17.597 "get_zone_info": false, 00:19:17.597 "zone_management": false, 00:19:17.597 "zone_append": false, 00:19:17.597 "compare": false, 00:19:17.597 "compare_and_write": false, 00:19:17.597 "abort": true, 00:19:17.597 "seek_hole": false, 00:19:17.597 "seek_data": false, 00:19:17.597 "copy": true, 00:19:17.597 "nvme_iov_md": false 00:19:17.597 }, 00:19:17.597 "memory_domains": [ 00:19:17.597 { 00:19:17.597 "dma_device_id": "system", 00:19:17.597 "dma_device_type": 1 00:19:17.597 }, 00:19:17.597 { 00:19:17.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:17.597 "dma_device_type": 2 00:19:17.597 } 00:19:17.597 ], 00:19:17.597 "driver_specific": {} 00:19:17.597 }' 00:19:17.597 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.597 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.597 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:17.597 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.855 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.855 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:17.855 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.855 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.855 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:17.855 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.855 10:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.855 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:17.855 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:17.855 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:17.855 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:18.113 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:18.113 "name": "BaseBdev3", 00:19:18.113 "aliases": [ 00:19:18.113 "cbfc6348-d740-419b-9d4c-f7a76cfa360b" 00:19:18.113 ], 00:19:18.113 "product_name": "Malloc disk", 00:19:18.113 "block_size": 512, 00:19:18.113 "num_blocks": 65536, 00:19:18.113 "uuid": "cbfc6348-d740-419b-9d4c-f7a76cfa360b", 00:19:18.113 "assigned_rate_limits": { 00:19:18.113 "rw_ios_per_sec": 0, 00:19:18.113 "rw_mbytes_per_sec": 0, 00:19:18.113 "r_mbytes_per_sec": 0, 00:19:18.113 "w_mbytes_per_sec": 0 00:19:18.113 
}, 00:19:18.113 "claimed": true, 00:19:18.113 "claim_type": "exclusive_write", 00:19:18.113 "zoned": false, 00:19:18.113 "supported_io_types": { 00:19:18.113 "read": true, 00:19:18.113 "write": true, 00:19:18.113 "unmap": true, 00:19:18.113 "flush": true, 00:19:18.113 "reset": true, 00:19:18.113 "nvme_admin": false, 00:19:18.113 "nvme_io": false, 00:19:18.113 "nvme_io_md": false, 00:19:18.113 "write_zeroes": true, 00:19:18.113 "zcopy": true, 00:19:18.113 "get_zone_info": false, 00:19:18.113 "zone_management": false, 00:19:18.113 "zone_append": false, 00:19:18.113 "compare": false, 00:19:18.113 "compare_and_write": false, 00:19:18.113 "abort": true, 00:19:18.113 "seek_hole": false, 00:19:18.113 "seek_data": false, 00:19:18.113 "copy": true, 00:19:18.113 "nvme_iov_md": false 00:19:18.113 }, 00:19:18.113 "memory_domains": [ 00:19:18.113 { 00:19:18.113 "dma_device_id": "system", 00:19:18.113 "dma_device_type": 1 00:19:18.113 }, 00:19:18.113 { 00:19:18.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.113 "dma_device_type": 2 00:19:18.113 } 00:19:18.113 ], 00:19:18.113 "driver_specific": {} 00:19:18.113 }' 00:19:18.113 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:18.371 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:18.629 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:18.629 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:18.629 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:18.629 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:18.887 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:18.887 "name": "BaseBdev4", 00:19:18.887 "aliases": [ 00:19:18.887 "a2393e9c-6442-4e0e-b78c-05a85390fc45" 00:19:18.887 ], 00:19:18.887 "product_name": "Malloc disk", 00:19:18.887 "block_size": 512, 00:19:18.887 "num_blocks": 65536, 00:19:18.887 "uuid": "a2393e9c-6442-4e0e-b78c-05a85390fc45", 00:19:18.887 "assigned_rate_limits": { 00:19:18.887 "rw_ios_per_sec": 0, 00:19:18.887 "rw_mbytes_per_sec": 0, 00:19:18.887 "r_mbytes_per_sec": 0, 00:19:18.887 "w_mbytes_per_sec": 0 00:19:18.887 }, 00:19:18.887 "claimed": true, 00:19:18.887 "claim_type": "exclusive_write", 00:19:18.887 "zoned": false, 00:19:18.887 "supported_io_types": { 00:19:18.887 "read": true, 
00:19:18.887 "write": true, 00:19:18.887 "unmap": true, 00:19:18.887 "flush": true, 00:19:18.887 "reset": true, 00:19:18.887 "nvme_admin": false, 00:19:18.887 "nvme_io": false, 00:19:18.887 "nvme_io_md": false, 00:19:18.887 "write_zeroes": true, 00:19:18.887 "zcopy": true, 00:19:18.887 "get_zone_info": false, 00:19:18.887 "zone_management": false, 00:19:18.887 "zone_append": false, 00:19:18.887 "compare": false, 00:19:18.887 "compare_and_write": false, 00:19:18.887 "abort": true, 00:19:18.887 "seek_hole": false, 00:19:18.887 "seek_data": false, 00:19:18.887 "copy": true, 00:19:18.887 "nvme_iov_md": false 00:19:18.887 }, 00:19:18.887 "memory_domains": [ 00:19:18.887 { 00:19:18.887 "dma_device_id": "system", 00:19:18.887 "dma_device_type": 1 00:19:18.887 }, 00:19:18.887 { 00:19:18.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.887 "dma_device_type": 2 00:19:18.887 } 00:19:18.887 ], 00:19:18.887 "driver_specific": {} 00:19:18.887 }' 00:19:18.887 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.887 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.887 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:18.887 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.887 10:45:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.887 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:18.887 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:18.887 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:19.146 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:19.146 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:19.146 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:19.146 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:19.146 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:19.404 [2024-07-12 10:45:54.422695] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:19.404 [2024-07-12 10:45:54.422722] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:19.404 [2024-07-12 10:45:54.422768] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:19.404 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:19.404 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:19.404 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:19.404 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:19.404 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:19.404 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:19.404 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.404 10:45:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.405 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.664 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.664 "name": "Existed_Raid", 00:19:19.664 "uuid": "a30e2e72-2749-47ed-b8ef-87d95ac481bb", 00:19:19.664 "strip_size_kb": 64, 00:19:19.664 "state": "offline", 00:19:19.664 "raid_level": "concat", 00:19:19.664 "superblock": false, 00:19:19.664 "num_base_bdevs": 4, 00:19:19.664 "num_base_bdevs_discovered": 3, 00:19:19.664 "num_base_bdevs_operational": 3, 00:19:19.664 "base_bdevs_list": [ 00:19:19.664 { 00:19:19.664 "name": null, 00:19:19.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.664 "is_configured": false, 00:19:19.664 "data_offset": 0, 00:19:19.664 "data_size": 65536 00:19:19.664 }, 00:19:19.664 { 00:19:19.664 "name": "BaseBdev2", 00:19:19.664 "uuid": "908c9e3c-8c6b-4f98-ba12-a15ade966e26", 00:19:19.664 "is_configured": true, 00:19:19.664 "data_offset": 0, 00:19:19.664 "data_size": 65536 00:19:19.664 }, 00:19:19.664 { 00:19:19.664 "name": "BaseBdev3", 00:19:19.664 "uuid": "cbfc6348-d740-419b-9d4c-f7a76cfa360b", 00:19:19.664 "is_configured": true, 00:19:19.664 "data_offset": 0, 00:19:19.664 "data_size": 65536 00:19:19.664 }, 00:19:19.664 { 00:19:19.664 "name": "BaseBdev4", 00:19:19.664 "uuid": "a2393e9c-6442-4e0e-b78c-05a85390fc45", 00:19:19.664 "is_configured": true, 00:19:19.664 "data_offset": 0, 00:19:19.664 "data_size": 65536 00:19:19.664 } 00:19:19.664 ] 00:19:19.664 }' 00:19:19.664 10:45:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.664 10:45:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:20.229 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:20.229 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:20.229 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.229 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:20.488 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:20.488 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:19:20.488 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:20.746 [2024-07-12 10:45:55.752058] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:20.746 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:20.746 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:20.746 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.746 10:45:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:21.004 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:21.004 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:21.004 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:21.263 [2024-07-12 10:45:56.257824] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:21.263 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:21.263 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:21.263 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.263 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:21.521 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:21.521 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:21.521 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:21.779 [2024-07-12 10:45:56.757762] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:21.779 [2024-07-12 10:45:56.757803] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1762350 name Existed_Raid, state offline 00:19:21.779 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:21.779 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:21.779 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.779 10:45:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:22.038 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:22.038 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:22.038 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:22.038 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:19:22.038 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:22.038 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:22.296 BaseBdev2 00:19:22.296 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:22.296 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:22.296 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:22.296 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:22.296 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:22.296 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:22.296 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:22.554 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:22.554 [ 00:19:22.554 { 00:19:22.554 "name": "BaseBdev2", 00:19:22.554 "aliases": [ 00:19:22.554 "5de64052-d91e-42d2-bd0d-f2b183a2aeab" 00:19:22.554 ], 00:19:22.554 "product_name": "Malloc disk", 00:19:22.554 "block_size": 512, 00:19:22.554 "num_blocks": 65536, 00:19:22.554 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:22.554 "assigned_rate_limits": { 00:19:22.554 "rw_ios_per_sec": 0, 00:19:22.554 "rw_mbytes_per_sec": 0, 00:19:22.554 "r_mbytes_per_sec": 0, 00:19:22.554 "w_mbytes_per_sec": 0 00:19:22.554 }, 00:19:22.554 "claimed": false, 00:19:22.554 "zoned": false, 00:19:22.554 "supported_io_types": { 00:19:22.555 "read": true, 00:19:22.555 "write": true, 00:19:22.555 "unmap": true, 00:19:22.555 "flush": true, 00:19:22.555 "reset": true, 00:19:22.555 "nvme_admin": false, 00:19:22.555 "nvme_io": false, 00:19:22.555 "nvme_io_md": false, 00:19:22.555 "write_zeroes": true, 00:19:22.555 "zcopy": true, 00:19:22.555 "get_zone_info": false, 00:19:22.555 "zone_management": false, 00:19:22.555 "zone_append": false, 00:19:22.555 "compare": false, 00:19:22.555 "compare_and_write": false, 00:19:22.555 "abort": true, 00:19:22.555 "seek_hole": false, 00:19:22.555 "seek_data": false, 00:19:22.555 "copy": true, 00:19:22.555 "nvme_iov_md": false 00:19:22.555 }, 00:19:22.555 "memory_domains": [ 00:19:22.555 { 00:19:22.555 "dma_device_id": "system", 00:19:22.555 "dma_device_type": 1 00:19:22.555 }, 00:19:22.555 { 00:19:22.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.555 "dma_device_type": 2 00:19:22.555 } 00:19:22.555 ], 00:19:22.555 "driver_specific": {} 00:19:22.555 } 00:19:22.555 ] 00:19:22.852 10:45:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:22.852 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:22.852 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:22.852 10:45:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:22.852 BaseBdev3 00:19:22.852 10:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:22.852 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:22.852 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:22.852 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:22.852 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:22.852 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:23.140 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:23.140 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:23.398 [ 00:19:23.398 { 00:19:23.398 "name": "BaseBdev3", 00:19:23.398 "aliases": [ 00:19:23.398 "680cc2e0-3834-46c9-8d27-1f2d43d0b8de" 00:19:23.398 ], 00:19:23.398 "product_name": "Malloc disk", 00:19:23.398 "block_size": 512, 00:19:23.398 "num_blocks": 65536, 00:19:23.398 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:23.398 "assigned_rate_limits": { 00:19:23.398 "rw_ios_per_sec": 0, 00:19:23.398 "rw_mbytes_per_sec": 0, 00:19:23.398 "r_mbytes_per_sec": 0, 00:19:23.398 "w_mbytes_per_sec": 0 00:19:23.398 }, 00:19:23.398 "claimed": false, 00:19:23.398 "zoned": false, 00:19:23.398 "supported_io_types": { 00:19:23.398 "read": true, 00:19:23.398 "write": true, 00:19:23.398 "unmap": true, 00:19:23.398 "flush": true, 00:19:23.398 "reset": true, 00:19:23.398 "nvme_admin": false, 00:19:23.398 "nvme_io": false, 00:19:23.398 "nvme_io_md": false, 00:19:23.398 "write_zeroes": true, 00:19:23.398 "zcopy": true, 00:19:23.398 "get_zone_info": false, 00:19:23.398 "zone_management": false, 00:19:23.398 "zone_append": false, 00:19:23.398 "compare": false, 00:19:23.398 "compare_and_write": false, 00:19:23.398 "abort": true, 00:19:23.398 "seek_hole": false, 00:19:23.398 "seek_data": false, 00:19:23.398 "copy": true, 00:19:23.398 "nvme_iov_md": false 00:19:23.398 }, 00:19:23.398 "memory_domains": [ 00:19:23.398 { 00:19:23.398 "dma_device_id": "system", 00:19:23.398 "dma_device_type": 1 00:19:23.398 }, 00:19:23.398 { 00:19:23.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.398 "dma_device_type": 2 00:19:23.398 } 00:19:23.398 ], 00:19:23.398 "driver_specific": {} 00:19:23.398 } 00:19:23.398 ] 00:19:23.398 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:23.398 10:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:23.398 10:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:23.398 10:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:23.656 BaseBdev4 00:19:23.656 10:45:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:23.656 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:19:23.656 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:23.656 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:23.656 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:23.656 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:23.656 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:23.914 10:45:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:24.172 [ 00:19:24.172 { 00:19:24.172 "name": "BaseBdev4", 00:19:24.172 "aliases": [ 00:19:24.172 "7f733969-e750-4066-aa67-519c1a288201" 00:19:24.172 ], 00:19:24.172 "product_name": "Malloc disk", 00:19:24.172 "block_size": 512, 00:19:24.172 "num_blocks": 65536, 00:19:24.172 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:24.172 "assigned_rate_limits": { 00:19:24.172 "rw_ios_per_sec": 0, 00:19:24.172 "rw_mbytes_per_sec": 0, 00:19:24.172 "r_mbytes_per_sec": 0, 00:19:24.172 "w_mbytes_per_sec": 0 00:19:24.172 }, 00:19:24.172 "claimed": false, 00:19:24.172 "zoned": false, 00:19:24.172 "supported_io_types": { 00:19:24.172 "read": true, 00:19:24.172 "write": true, 00:19:24.172 "unmap": true, 00:19:24.172 "flush": true, 00:19:24.172 "reset": true, 00:19:24.172 "nvme_admin": false, 00:19:24.172 "nvme_io": false, 00:19:24.172 "nvme_io_md": false, 00:19:24.172 "write_zeroes": true, 00:19:24.172 "zcopy": true, 00:19:24.172 "get_zone_info": false, 00:19:24.172 "zone_management": false, 00:19:24.172 "zone_append": false, 00:19:24.172 "compare": false, 00:19:24.172 "compare_and_write": false, 00:19:24.172 "abort": true, 00:19:24.172 "seek_hole": false, 00:19:24.172 "seek_data": false, 00:19:24.172 "copy": true, 00:19:24.172 "nvme_iov_md": false 00:19:24.172 }, 00:19:24.172 "memory_domains": [ 00:19:24.172 { 00:19:24.172 "dma_device_id": "system", 00:19:24.172 "dma_device_type": 1 00:19:24.172 }, 00:19:24.172 { 00:19:24.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.172 "dma_device_type": 2 00:19:24.172 } 00:19:24.172 ], 00:19:24.172 "driver_specific": {} 00:19:24.172 } 00:19:24.172 ] 00:19:24.172 10:45:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:24.172 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:24.172 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:24.172 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:24.429 [2024-07-12 10:45:59.418420] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:24.429 [2024-07-12 10:45:59.418460] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:24.429 [2024-07-12 10:45:59.418480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:24.429 [2024-07-12 10:45:59.419844] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:24.429 [2024-07-12 10:45:59.419891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.429 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:24.686 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.686 "name": "Existed_Raid", 00:19:24.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.686 "strip_size_kb": 64, 00:19:24.686 "state": "configuring", 00:19:24.686 "raid_level": "concat", 00:19:24.686 "superblock": false, 00:19:24.686 "num_base_bdevs": 4, 00:19:24.686 "num_base_bdevs_discovered": 3, 00:19:24.686 "num_base_bdevs_operational": 4, 00:19:24.686 "base_bdevs_list": [ 00:19:24.686 { 00:19:24.686 "name": "BaseBdev1", 00:19:24.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.686 "is_configured": false, 00:19:24.686 "data_offset": 0, 00:19:24.686 "data_size": 0 00:19:24.686 }, 00:19:24.686 { 00:19:24.686 "name": "BaseBdev2", 00:19:24.687 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:24.687 "is_configured": true, 00:19:24.687 "data_offset": 0, 00:19:24.687 "data_size": 65536 00:19:24.687 }, 00:19:24.687 { 00:19:24.687 "name": "BaseBdev3", 00:19:24.687 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:24.687 "is_configured": true, 00:19:24.687 "data_offset": 0, 00:19:24.687 "data_size": 65536 00:19:24.687 }, 00:19:24.687 { 00:19:24.687 "name": "BaseBdev4", 00:19:24.687 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:24.687 "is_configured": true, 00:19:24.687 "data_offset": 0, 00:19:24.687 "data_size": 65536 00:19:24.687 } 00:19:24.687 ] 00:19:24.687 }' 00:19:24.687 10:45:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.687 10:45:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.251 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
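The verify_raid_bdev_state checks that follow are built from the same two ingredients seen throughout this run: an rpc.py bdev_raid_get_bdevs all call against the test socket, and a jq filter over the returned array. A minimal stand-alone sketch of that check — assuming an SPDK target is already listening on /var/tmp/spdk-raid.sock, jq is installed, and rpc.py is at the path used throughout this log — might look like:

    #!/usr/bin/env bash
    # Sketch of the state check driven by verify_raid_bdev_state in the run above.
    # Assumptions: SPDK target listening on /var/tmp/spdk-raid.sock, jq available,
    # rpc.py at the workspace path used throughout this log.
    set -euo pipefail

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    expected_state=configuring
    expected_operational=4

    # Pull the entry for Existed_Raid out of the raid bdev list.
    info=$("$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')

    # Compare the reported state and operational base bdev count with the expected values.
    state=$(jq -r '.state' <<< "$info")
    operational=$(jq -r '.num_base_bdevs_operational' <<< "$info")

    [ "$state" = "$expected_state" ] && [ "$operational" -eq "$expected_operational" ]

The test script also inspects num_base_bdevs_discovered and base_bdevs_list from the same JSON, but the query/filter pair is identical to the one shown here.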
00:19:25.508 [2024-07-12 10:46:00.485356] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.508 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.764 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.764 "name": "Existed_Raid", 00:19:25.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.765 "strip_size_kb": 64, 00:19:25.765 "state": "configuring", 00:19:25.765 "raid_level": "concat", 00:19:25.765 "superblock": false, 00:19:25.765 "num_base_bdevs": 4, 00:19:25.765 "num_base_bdevs_discovered": 2, 00:19:25.765 "num_base_bdevs_operational": 4, 00:19:25.765 "base_bdevs_list": [ 00:19:25.765 { 00:19:25.765 "name": "BaseBdev1", 00:19:25.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.765 "is_configured": false, 00:19:25.765 "data_offset": 0, 00:19:25.765 "data_size": 0 00:19:25.765 }, 00:19:25.765 { 00:19:25.765 "name": null, 00:19:25.765 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:25.765 "is_configured": false, 00:19:25.765 "data_offset": 0, 00:19:25.765 "data_size": 65536 00:19:25.765 }, 00:19:25.765 { 00:19:25.765 "name": "BaseBdev3", 00:19:25.765 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:25.765 "is_configured": true, 00:19:25.765 "data_offset": 0, 00:19:25.765 "data_size": 65536 00:19:25.765 }, 00:19:25.765 { 00:19:25.765 "name": "BaseBdev4", 00:19:25.765 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:25.765 "is_configured": true, 00:19:25.765 "data_offset": 0, 00:19:25.765 "data_size": 65536 00:19:25.765 } 00:19:25.765 ] 00:19:25.765 }' 00:19:25.765 10:46:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.765 10:46:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.329 10:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:26.329 10:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:26.586 10:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:26.586 10:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:26.844 [2024-07-12 10:46:01.852303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:26.844 BaseBdev1 00:19:26.844 10:46:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:26.844 10:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:26.844 10:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:26.844 10:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:26.844 10:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:26.844 10:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:26.844 10:46:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:27.101 10:46:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:27.359 [ 00:19:27.359 { 00:19:27.359 "name": "BaseBdev1", 00:19:27.359 "aliases": [ 00:19:27.359 "cef5aeaa-6fbe-45e1-b776-7cd8aff52568" 00:19:27.359 ], 00:19:27.359 "product_name": "Malloc disk", 00:19:27.359 "block_size": 512, 00:19:27.359 "num_blocks": 65536, 00:19:27.359 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:27.359 "assigned_rate_limits": { 00:19:27.359 "rw_ios_per_sec": 0, 00:19:27.359 "rw_mbytes_per_sec": 0, 00:19:27.359 "r_mbytes_per_sec": 0, 00:19:27.359 "w_mbytes_per_sec": 0 00:19:27.359 }, 00:19:27.359 "claimed": true, 00:19:27.359 "claim_type": "exclusive_write", 00:19:27.359 "zoned": false, 00:19:27.359 "supported_io_types": { 00:19:27.359 "read": true, 00:19:27.359 "write": true, 00:19:27.359 "unmap": true, 00:19:27.359 "flush": true, 00:19:27.359 "reset": true, 00:19:27.359 "nvme_admin": false, 00:19:27.359 "nvme_io": false, 00:19:27.359 "nvme_io_md": false, 00:19:27.359 "write_zeroes": true, 00:19:27.360 "zcopy": true, 00:19:27.360 "get_zone_info": false, 00:19:27.360 "zone_management": false, 00:19:27.360 "zone_append": false, 00:19:27.360 "compare": false, 00:19:27.360 "compare_and_write": false, 00:19:27.360 "abort": true, 00:19:27.360 "seek_hole": false, 00:19:27.360 "seek_data": false, 00:19:27.360 "copy": true, 00:19:27.360 "nvme_iov_md": false 00:19:27.360 }, 00:19:27.360 "memory_domains": [ 00:19:27.360 { 00:19:27.360 "dma_device_id": "system", 00:19:27.360 "dma_device_type": 1 00:19:27.360 }, 00:19:27.360 { 00:19:27.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.360 "dma_device_type": 2 00:19:27.360 } 00:19:27.360 ], 00:19:27.360 "driver_specific": {} 00:19:27.360 } 00:19:27.360 ] 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:27.360 10:46:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.360 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:27.618 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.618 "name": "Existed_Raid", 00:19:27.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.618 "strip_size_kb": 64, 00:19:27.618 "state": "configuring", 00:19:27.618 "raid_level": "concat", 00:19:27.618 "superblock": false, 00:19:27.618 "num_base_bdevs": 4, 00:19:27.618 "num_base_bdevs_discovered": 3, 00:19:27.618 "num_base_bdevs_operational": 4, 00:19:27.618 "base_bdevs_list": [ 00:19:27.618 { 00:19:27.618 "name": "BaseBdev1", 00:19:27.618 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:27.618 "is_configured": true, 00:19:27.618 "data_offset": 0, 00:19:27.618 "data_size": 65536 00:19:27.618 }, 00:19:27.618 { 00:19:27.618 "name": null, 00:19:27.618 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:27.618 "is_configured": false, 00:19:27.618 "data_offset": 0, 00:19:27.618 "data_size": 65536 00:19:27.618 }, 00:19:27.618 { 00:19:27.618 "name": "BaseBdev3", 00:19:27.618 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:27.618 "is_configured": true, 00:19:27.618 "data_offset": 0, 00:19:27.618 "data_size": 65536 00:19:27.618 }, 00:19:27.618 { 00:19:27.618 "name": "BaseBdev4", 00:19:27.618 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:27.618 "is_configured": true, 00:19:27.618 "data_offset": 0, 00:19:27.618 "data_size": 65536 00:19:27.618 } 00:19:27.618 ] 00:19:27.618 }' 00:19:27.618 10:46:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.618 10:46:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.183 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.183 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:28.183 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:28.183 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:28.440 [2024-07-12 10:46:03.600976] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:28.440 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:28.440 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:28.440 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:28.440 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:28.440 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:28.440 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:28.441 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.441 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.441 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.441 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.441 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.441 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:28.698 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.698 "name": "Existed_Raid", 00:19:28.698 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:28.698 "strip_size_kb": 64, 00:19:28.698 "state": "configuring", 00:19:28.698 "raid_level": "concat", 00:19:28.698 "superblock": false, 00:19:28.698 "num_base_bdevs": 4, 00:19:28.698 "num_base_bdevs_discovered": 2, 00:19:28.698 "num_base_bdevs_operational": 4, 00:19:28.698 "base_bdevs_list": [ 00:19:28.698 { 00:19:28.698 "name": "BaseBdev1", 00:19:28.698 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:28.698 "is_configured": true, 00:19:28.698 "data_offset": 0, 00:19:28.698 "data_size": 65536 00:19:28.698 }, 00:19:28.698 { 00:19:28.698 "name": null, 00:19:28.698 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:28.698 "is_configured": false, 00:19:28.698 "data_offset": 0, 00:19:28.698 "data_size": 65536 00:19:28.698 }, 00:19:28.698 { 00:19:28.698 "name": null, 00:19:28.698 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:28.698 "is_configured": false, 00:19:28.698 "data_offset": 0, 00:19:28.698 "data_size": 65536 00:19:28.698 }, 00:19:28.698 { 00:19:28.698 "name": "BaseBdev4", 00:19:28.698 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:28.698 "is_configured": true, 00:19:28.698 "data_offset": 0, 00:19:28.698 "data_size": 65536 00:19:28.698 } 00:19:28.698 ] 00:19:28.698 }' 00:19:28.698 10:46:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.698 10:46:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:29.631 10:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.631 10:46:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:29.631 10:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:29.631 10:46:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:30.197 [2024-07-12 10:46:05.197408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.197 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:30.455 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:30.455 "name": "Existed_Raid", 00:19:30.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:30.455 "strip_size_kb": 64, 00:19:30.455 "state": "configuring", 00:19:30.455 "raid_level": "concat", 00:19:30.455 "superblock": false, 00:19:30.455 "num_base_bdevs": 4, 00:19:30.455 "num_base_bdevs_discovered": 3, 00:19:30.455 "num_base_bdevs_operational": 4, 00:19:30.455 "base_bdevs_list": [ 00:19:30.455 { 00:19:30.455 "name": "BaseBdev1", 00:19:30.455 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:30.455 "is_configured": true, 00:19:30.455 "data_offset": 0, 00:19:30.455 "data_size": 65536 00:19:30.455 }, 00:19:30.455 { 00:19:30.455 "name": null, 00:19:30.455 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:30.455 "is_configured": false, 00:19:30.455 "data_offset": 0, 00:19:30.455 "data_size": 65536 00:19:30.455 }, 00:19:30.455 { 00:19:30.455 "name": "BaseBdev3", 00:19:30.455 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:30.455 "is_configured": true, 00:19:30.455 "data_offset": 0, 00:19:30.455 "data_size": 65536 00:19:30.455 }, 00:19:30.455 { 00:19:30.455 "name": "BaseBdev4", 00:19:30.455 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:30.455 "is_configured": true, 00:19:30.455 "data_offset": 0, 00:19:30.456 "data_size": 65536 00:19:30.456 } 00:19:30.456 ] 00:19:30.456 }' 00:19:30.456 10:46:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:30.456 10:46:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:31.022 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.022 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:31.280 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:31.280 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:31.538 [2024-07-12 10:46:06.532947] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.538 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.796 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.796 "name": "Existed_Raid", 00:19:31.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.796 "strip_size_kb": 64, 00:19:31.796 "state": "configuring", 00:19:31.796 "raid_level": "concat", 00:19:31.796 "superblock": false, 00:19:31.796 "num_base_bdevs": 4, 00:19:31.796 "num_base_bdevs_discovered": 2, 00:19:31.796 "num_base_bdevs_operational": 4, 00:19:31.796 "base_bdevs_list": [ 00:19:31.796 { 00:19:31.796 "name": null, 00:19:31.796 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:31.796 "is_configured": false, 00:19:31.796 "data_offset": 0, 00:19:31.796 "data_size": 65536 00:19:31.796 }, 00:19:31.796 { 00:19:31.796 "name": null, 00:19:31.796 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:31.796 "is_configured": false, 00:19:31.796 "data_offset": 0, 00:19:31.796 "data_size": 65536 00:19:31.796 }, 00:19:31.796 { 00:19:31.796 "name": "BaseBdev3", 00:19:31.796 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:31.796 "is_configured": true, 00:19:31.796 "data_offset": 0, 00:19:31.796 "data_size": 65536 00:19:31.796 }, 00:19:31.796 { 
00:19:31.796 "name": "BaseBdev4", 00:19:31.796 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:31.796 "is_configured": true, 00:19:31.796 "data_offset": 0, 00:19:31.796 "data_size": 65536 00:19:31.796 } 00:19:31.796 ] 00:19:31.796 }' 00:19:31.796 10:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.797 10:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:32.361 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.361 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:32.620 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:32.620 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:32.620 [2024-07-12 10:46:07.796734] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:32.620 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:32.620 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.620 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.879 10:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.879 10:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.879 "name": "Existed_Raid", 00:19:32.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.879 "strip_size_kb": 64, 00:19:32.879 "state": "configuring", 00:19:32.879 "raid_level": "concat", 00:19:32.879 "superblock": false, 00:19:32.879 "num_base_bdevs": 4, 00:19:32.879 "num_base_bdevs_discovered": 3, 00:19:32.879 "num_base_bdevs_operational": 4, 00:19:32.879 "base_bdevs_list": [ 00:19:32.879 { 00:19:32.879 "name": null, 00:19:32.879 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:32.879 "is_configured": false, 00:19:32.879 "data_offset": 0, 00:19:32.879 "data_size": 65536 00:19:32.879 }, 00:19:32.879 { 00:19:32.879 "name": "BaseBdev2", 00:19:32.879 "uuid": 
"5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:32.879 "is_configured": true, 00:19:32.879 "data_offset": 0, 00:19:32.879 "data_size": 65536 00:19:32.879 }, 00:19:32.879 { 00:19:32.879 "name": "BaseBdev3", 00:19:32.879 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:32.879 "is_configured": true, 00:19:32.879 "data_offset": 0, 00:19:32.879 "data_size": 65536 00:19:32.879 }, 00:19:32.879 { 00:19:32.879 "name": "BaseBdev4", 00:19:32.879 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:32.879 "is_configured": true, 00:19:32.879 "data_offset": 0, 00:19:32.879 "data_size": 65536 00:19:32.879 } 00:19:32.879 ] 00:19:32.879 }' 00:19:32.879 10:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.879 10:46:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.813 10:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.813 10:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:33.813 10:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:33.813 10:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.813 10:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:34.071 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cef5aeaa-6fbe-45e1-b776-7cd8aff52568 00:19:34.329 [2024-07-12 10:46:09.352232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:34.329 [2024-07-12 10:46:09.352268] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1766040 00:19:34.329 [2024-07-12 10:46:09.352278] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:34.329 [2024-07-12 10:46:09.352469] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1761a70 00:19:34.329 [2024-07-12 10:46:09.352593] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1766040 00:19:34.329 [2024-07-12 10:46:09.352603] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1766040 00:19:34.329 [2024-07-12 10:46:09.352758] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:34.329 NewBaseBdev 00:19:34.329 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:34.329 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:34.329 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:34.329 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:34.329 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:34.329 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:34.329 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:34.586 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:34.843 [ 00:19:34.843 { 00:19:34.843 "name": "NewBaseBdev", 00:19:34.843 "aliases": [ 00:19:34.843 "cef5aeaa-6fbe-45e1-b776-7cd8aff52568" 00:19:34.843 ], 00:19:34.843 "product_name": "Malloc disk", 00:19:34.843 "block_size": 512, 00:19:34.843 "num_blocks": 65536, 00:19:34.843 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:34.843 "assigned_rate_limits": { 00:19:34.843 "rw_ios_per_sec": 0, 00:19:34.843 "rw_mbytes_per_sec": 0, 00:19:34.843 "r_mbytes_per_sec": 0, 00:19:34.843 "w_mbytes_per_sec": 0 00:19:34.843 }, 00:19:34.843 "claimed": true, 00:19:34.843 "claim_type": "exclusive_write", 00:19:34.843 "zoned": false, 00:19:34.843 "supported_io_types": { 00:19:34.843 "read": true, 00:19:34.843 "write": true, 00:19:34.843 "unmap": true, 00:19:34.843 "flush": true, 00:19:34.843 "reset": true, 00:19:34.843 "nvme_admin": false, 00:19:34.843 "nvme_io": false, 00:19:34.843 "nvme_io_md": false, 00:19:34.843 "write_zeroes": true, 00:19:34.843 "zcopy": true, 00:19:34.843 "get_zone_info": false, 00:19:34.843 "zone_management": false, 00:19:34.843 "zone_append": false, 00:19:34.844 "compare": false, 00:19:34.844 "compare_and_write": false, 00:19:34.844 "abort": true, 00:19:34.844 "seek_hole": false, 00:19:34.844 "seek_data": false, 00:19:34.844 "copy": true, 00:19:34.844 "nvme_iov_md": false 00:19:34.844 }, 00:19:34.844 "memory_domains": [ 00:19:34.844 { 00:19:34.844 "dma_device_id": "system", 00:19:34.844 "dma_device_type": 1 00:19:34.844 }, 00:19:34.844 { 00:19:34.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.844 "dma_device_type": 2 00:19:34.844 } 00:19:34.844 ], 00:19:34.844 "driver_specific": {} 00:19:34.844 } 00:19:34.844 ] 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.844 10:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:19:35.101 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.101 "name": "Existed_Raid", 00:19:35.101 "uuid": "5dca1193-5c3c-49c4-ac39-f2617a335251", 00:19:35.101 "strip_size_kb": 64, 00:19:35.101 "state": "online", 00:19:35.101 "raid_level": "concat", 00:19:35.101 "superblock": false, 00:19:35.101 "num_base_bdevs": 4, 00:19:35.101 "num_base_bdevs_discovered": 4, 00:19:35.101 "num_base_bdevs_operational": 4, 00:19:35.101 "base_bdevs_list": [ 00:19:35.101 { 00:19:35.101 "name": "NewBaseBdev", 00:19:35.101 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:35.101 "is_configured": true, 00:19:35.101 "data_offset": 0, 00:19:35.101 "data_size": 65536 00:19:35.101 }, 00:19:35.101 { 00:19:35.101 "name": "BaseBdev2", 00:19:35.101 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:35.101 "is_configured": true, 00:19:35.101 "data_offset": 0, 00:19:35.101 "data_size": 65536 00:19:35.101 }, 00:19:35.101 { 00:19:35.101 "name": "BaseBdev3", 00:19:35.101 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:35.101 "is_configured": true, 00:19:35.101 "data_offset": 0, 00:19:35.101 "data_size": 65536 00:19:35.101 }, 00:19:35.101 { 00:19:35.101 "name": "BaseBdev4", 00:19:35.101 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:35.101 "is_configured": true, 00:19:35.101 "data_offset": 0, 00:19:35.101 "data_size": 65536 00:19:35.101 } 00:19:35.101 ] 00:19:35.101 }' 00:19:35.101 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.101 10:46:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:35.667 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:35.667 [2024-07-12 10:46:10.840659] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:35.926 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:35.926 "name": "Existed_Raid", 00:19:35.926 "aliases": [ 00:19:35.926 "5dca1193-5c3c-49c4-ac39-f2617a335251" 00:19:35.926 ], 00:19:35.926 "product_name": "Raid Volume", 00:19:35.926 "block_size": 512, 00:19:35.926 "num_blocks": 262144, 00:19:35.926 "uuid": "5dca1193-5c3c-49c4-ac39-f2617a335251", 00:19:35.926 "assigned_rate_limits": { 00:19:35.926 "rw_ios_per_sec": 0, 00:19:35.926 "rw_mbytes_per_sec": 0, 00:19:35.926 "r_mbytes_per_sec": 0, 00:19:35.926 "w_mbytes_per_sec": 0 00:19:35.926 }, 00:19:35.926 "claimed": false, 00:19:35.926 "zoned": false, 00:19:35.926 "supported_io_types": { 00:19:35.926 "read": true, 00:19:35.926 "write": true, 00:19:35.926 "unmap": true, 
00:19:35.926 "flush": true, 00:19:35.926 "reset": true, 00:19:35.926 "nvme_admin": false, 00:19:35.926 "nvme_io": false, 00:19:35.926 "nvme_io_md": false, 00:19:35.926 "write_zeroes": true, 00:19:35.926 "zcopy": false, 00:19:35.926 "get_zone_info": false, 00:19:35.926 "zone_management": false, 00:19:35.926 "zone_append": false, 00:19:35.926 "compare": false, 00:19:35.926 "compare_and_write": false, 00:19:35.926 "abort": false, 00:19:35.926 "seek_hole": false, 00:19:35.926 "seek_data": false, 00:19:35.926 "copy": false, 00:19:35.926 "nvme_iov_md": false 00:19:35.926 }, 00:19:35.926 "memory_domains": [ 00:19:35.926 { 00:19:35.926 "dma_device_id": "system", 00:19:35.926 "dma_device_type": 1 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.926 "dma_device_type": 2 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "system", 00:19:35.926 "dma_device_type": 1 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.926 "dma_device_type": 2 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "system", 00:19:35.926 "dma_device_type": 1 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.926 "dma_device_type": 2 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "system", 00:19:35.926 "dma_device_type": 1 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.926 "dma_device_type": 2 00:19:35.926 } 00:19:35.926 ], 00:19:35.926 "driver_specific": { 00:19:35.926 "raid": { 00:19:35.926 "uuid": "5dca1193-5c3c-49c4-ac39-f2617a335251", 00:19:35.926 "strip_size_kb": 64, 00:19:35.926 "state": "online", 00:19:35.926 "raid_level": "concat", 00:19:35.926 "superblock": false, 00:19:35.926 "num_base_bdevs": 4, 00:19:35.926 "num_base_bdevs_discovered": 4, 00:19:35.926 "num_base_bdevs_operational": 4, 00:19:35.926 "base_bdevs_list": [ 00:19:35.926 { 00:19:35.926 "name": "NewBaseBdev", 00:19:35.926 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:35.926 "is_configured": true, 00:19:35.926 "data_offset": 0, 00:19:35.926 "data_size": 65536 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "name": "BaseBdev2", 00:19:35.926 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:35.926 "is_configured": true, 00:19:35.926 "data_offset": 0, 00:19:35.926 "data_size": 65536 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "name": "BaseBdev3", 00:19:35.926 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:35.926 "is_configured": true, 00:19:35.926 "data_offset": 0, 00:19:35.926 "data_size": 65536 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "name": "BaseBdev4", 00:19:35.926 "uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:35.926 "is_configured": true, 00:19:35.926 "data_offset": 0, 00:19:35.926 "data_size": 65536 00:19:35.926 } 00:19:35.926 ] 00:19:35.926 } 00:19:35.926 } 00:19:35.926 }' 00:19:35.926 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:35.926 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:35.926 BaseBdev2 00:19:35.926 BaseBdev3 00:19:35.926 BaseBdev4' 00:19:35.926 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:35.926 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:19:35.926 10:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:35.926 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:35.926 "name": "NewBaseBdev", 00:19:35.926 "aliases": [ 00:19:35.926 "cef5aeaa-6fbe-45e1-b776-7cd8aff52568" 00:19:35.926 ], 00:19:35.926 "product_name": "Malloc disk", 00:19:35.926 "block_size": 512, 00:19:35.926 "num_blocks": 65536, 00:19:35.926 "uuid": "cef5aeaa-6fbe-45e1-b776-7cd8aff52568", 00:19:35.926 "assigned_rate_limits": { 00:19:35.926 "rw_ios_per_sec": 0, 00:19:35.926 "rw_mbytes_per_sec": 0, 00:19:35.926 "r_mbytes_per_sec": 0, 00:19:35.926 "w_mbytes_per_sec": 0 00:19:35.926 }, 00:19:35.926 "claimed": true, 00:19:35.926 "claim_type": "exclusive_write", 00:19:35.926 "zoned": false, 00:19:35.926 "supported_io_types": { 00:19:35.926 "read": true, 00:19:35.926 "write": true, 00:19:35.926 "unmap": true, 00:19:35.926 "flush": true, 00:19:35.926 "reset": true, 00:19:35.926 "nvme_admin": false, 00:19:35.926 "nvme_io": false, 00:19:35.926 "nvme_io_md": false, 00:19:35.926 "write_zeroes": true, 00:19:35.926 "zcopy": true, 00:19:35.926 "get_zone_info": false, 00:19:35.926 "zone_management": false, 00:19:35.926 "zone_append": false, 00:19:35.926 "compare": false, 00:19:35.926 "compare_and_write": false, 00:19:35.926 "abort": true, 00:19:35.926 "seek_hole": false, 00:19:35.926 "seek_data": false, 00:19:35.926 "copy": true, 00:19:35.926 "nvme_iov_md": false 00:19:35.926 }, 00:19:35.926 "memory_domains": [ 00:19:35.926 { 00:19:35.926 "dma_device_id": "system", 00:19:35.926 "dma_device_type": 1 00:19:35.926 }, 00:19:35.926 { 00:19:35.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.926 "dma_device_type": 2 00:19:35.926 } 00:19:35.926 ], 00:19:35.926 "driver_specific": {} 00:19:35.926 }' 00:19:35.926 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:36.185 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.444 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.444 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:36.444 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:36.444 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:36.444 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:36.702 10:46:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:36.702 "name": "BaseBdev2", 00:19:36.702 "aliases": [ 00:19:36.702 "5de64052-d91e-42d2-bd0d-f2b183a2aeab" 00:19:36.702 ], 00:19:36.702 "product_name": "Malloc disk", 00:19:36.702 "block_size": 512, 00:19:36.702 "num_blocks": 65536, 00:19:36.702 "uuid": "5de64052-d91e-42d2-bd0d-f2b183a2aeab", 00:19:36.702 "assigned_rate_limits": { 00:19:36.702 "rw_ios_per_sec": 0, 00:19:36.702 "rw_mbytes_per_sec": 0, 00:19:36.702 "r_mbytes_per_sec": 0, 00:19:36.702 "w_mbytes_per_sec": 0 00:19:36.702 }, 00:19:36.702 "claimed": true, 00:19:36.702 "claim_type": "exclusive_write", 00:19:36.702 "zoned": false, 00:19:36.702 "supported_io_types": { 00:19:36.702 "read": true, 00:19:36.702 "write": true, 00:19:36.702 "unmap": true, 00:19:36.702 "flush": true, 00:19:36.702 "reset": true, 00:19:36.702 "nvme_admin": false, 00:19:36.702 "nvme_io": false, 00:19:36.702 "nvme_io_md": false, 00:19:36.702 "write_zeroes": true, 00:19:36.702 "zcopy": true, 00:19:36.702 "get_zone_info": false, 00:19:36.702 "zone_management": false, 00:19:36.702 "zone_append": false, 00:19:36.702 "compare": false, 00:19:36.702 "compare_and_write": false, 00:19:36.702 "abort": true, 00:19:36.702 "seek_hole": false, 00:19:36.702 "seek_data": false, 00:19:36.702 "copy": true, 00:19:36.702 "nvme_iov_md": false 00:19:36.702 }, 00:19:36.702 "memory_domains": [ 00:19:36.702 { 00:19:36.703 "dma_device_id": "system", 00:19:36.703 "dma_device_type": 1 00:19:36.703 }, 00:19:36.703 { 00:19:36.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.703 "dma_device_type": 2 00:19:36.703 } 00:19:36.703 ], 00:19:36.703 "driver_specific": {} 00:19:36.703 }' 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.703 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.961 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:36.961 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.961 10:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.961 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:36.961 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:36.961 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:36.961 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.219 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.219 "name": "BaseBdev3", 00:19:37.219 "aliases": [ 00:19:37.219 
"680cc2e0-3834-46c9-8d27-1f2d43d0b8de" 00:19:37.219 ], 00:19:37.219 "product_name": "Malloc disk", 00:19:37.219 "block_size": 512, 00:19:37.219 "num_blocks": 65536, 00:19:37.219 "uuid": "680cc2e0-3834-46c9-8d27-1f2d43d0b8de", 00:19:37.219 "assigned_rate_limits": { 00:19:37.219 "rw_ios_per_sec": 0, 00:19:37.219 "rw_mbytes_per_sec": 0, 00:19:37.219 "r_mbytes_per_sec": 0, 00:19:37.219 "w_mbytes_per_sec": 0 00:19:37.219 }, 00:19:37.219 "claimed": true, 00:19:37.219 "claim_type": "exclusive_write", 00:19:37.219 "zoned": false, 00:19:37.219 "supported_io_types": { 00:19:37.219 "read": true, 00:19:37.219 "write": true, 00:19:37.219 "unmap": true, 00:19:37.219 "flush": true, 00:19:37.219 "reset": true, 00:19:37.219 "nvme_admin": false, 00:19:37.219 "nvme_io": false, 00:19:37.219 "nvme_io_md": false, 00:19:37.219 "write_zeroes": true, 00:19:37.219 "zcopy": true, 00:19:37.219 "get_zone_info": false, 00:19:37.219 "zone_management": false, 00:19:37.219 "zone_append": false, 00:19:37.219 "compare": false, 00:19:37.219 "compare_and_write": false, 00:19:37.219 "abort": true, 00:19:37.219 "seek_hole": false, 00:19:37.219 "seek_data": false, 00:19:37.219 "copy": true, 00:19:37.219 "nvme_iov_md": false 00:19:37.219 }, 00:19:37.219 "memory_domains": [ 00:19:37.219 { 00:19:37.219 "dma_device_id": "system", 00:19:37.219 "dma_device_type": 1 00:19:37.219 }, 00:19:37.219 { 00:19:37.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.219 "dma_device_type": 2 00:19:37.219 } 00:19:37.219 ], 00:19:37.219 "driver_specific": {} 00:19:37.219 }' 00:19:37.219 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.219 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.219 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:37.219 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.219 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:37.486 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.760 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.760 "name": "BaseBdev4", 00:19:37.760 "aliases": [ 00:19:37.760 "7f733969-e750-4066-aa67-519c1a288201" 00:19:37.760 ], 00:19:37.760 "product_name": "Malloc disk", 00:19:37.760 "block_size": 512, 00:19:37.760 "num_blocks": 65536, 00:19:37.760 
"uuid": "7f733969-e750-4066-aa67-519c1a288201", 00:19:37.760 "assigned_rate_limits": { 00:19:37.760 "rw_ios_per_sec": 0, 00:19:37.760 "rw_mbytes_per_sec": 0, 00:19:37.760 "r_mbytes_per_sec": 0, 00:19:37.760 "w_mbytes_per_sec": 0 00:19:37.760 }, 00:19:37.760 "claimed": true, 00:19:37.760 "claim_type": "exclusive_write", 00:19:37.760 "zoned": false, 00:19:37.760 "supported_io_types": { 00:19:37.760 "read": true, 00:19:37.760 "write": true, 00:19:37.760 "unmap": true, 00:19:37.760 "flush": true, 00:19:37.760 "reset": true, 00:19:37.760 "nvme_admin": false, 00:19:37.760 "nvme_io": false, 00:19:37.760 "nvme_io_md": false, 00:19:37.760 "write_zeroes": true, 00:19:37.760 "zcopy": true, 00:19:37.760 "get_zone_info": false, 00:19:37.760 "zone_management": false, 00:19:37.760 "zone_append": false, 00:19:37.760 "compare": false, 00:19:37.760 "compare_and_write": false, 00:19:37.760 "abort": true, 00:19:37.760 "seek_hole": false, 00:19:37.760 "seek_data": false, 00:19:37.760 "copy": true, 00:19:37.760 "nvme_iov_md": false 00:19:37.760 }, 00:19:37.760 "memory_domains": [ 00:19:37.760 { 00:19:37.760 "dma_device_id": "system", 00:19:37.760 "dma_device_type": 1 00:19:37.760 }, 00:19:37.760 { 00:19:37.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.760 "dma_device_type": 2 00:19:37.760 } 00:19:37.760 ], 00:19:37.760 "driver_specific": {} 00:19:37.760 }' 00:19:37.760 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.760 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.035 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.035 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.035 10:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.035 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:38.293 [2024-07-12 10:46:13.407174] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:38.294 [2024-07-12 10:46:13.407203] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:38.294 [2024-07-12 10:46:13.407256] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:38.294 [2024-07-12 10:46:13.407314] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:38.294 [2024-07-12 10:46:13.407325] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1766040 name Existed_Raid, state offline 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 2090615 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2090615 ']' 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2090615 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2090615 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2090615' 00:19:38.294 killing process with pid 2090615 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2090615 00:19:38.294 [2024-07-12 10:46:13.476451] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:38.294 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2090615 00:19:38.551 [2024-07-12 10:46:13.514898] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:38.551 10:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:38.551 00:19:38.551 real 0m32.079s 00:19:38.551 user 0m58.892s 00:19:38.551 sys 0m5.755s 00:19:38.551 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:38.551 10:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:38.551 ************************************ 00:19:38.551 END TEST raid_state_function_test 00:19:38.551 ************************************ 00:19:38.810 10:46:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:38.810 10:46:13 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:38.810 10:46:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:38.810 10:46:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:38.810 10:46:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:38.810 ************************************ 00:19:38.810 START TEST raid_state_function_test_sb 00:19:38.810 ************************************ 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:38.810 
10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2095367 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2095367' 00:19:38.810 Process raid pid: 2095367 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2095367 /var/tmp/spdk-raid.sock 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2095367 ']' 00:19:38.810 10:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:38.811 10:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:38.811 10:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:38.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:38.811 10:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:38.811 10:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:38.811 [2024-07-12 10:46:13.877707] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:19:38.811 [2024-07-12 10:46:13.877763] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:38.811 [2024-07-12 10:46:13.991846] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:39.070 [2024-07-12 10:46:14.096235] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:39.070 [2024-07-12 10:46:14.164426] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:39.070 [2024-07-12 10:46:14.164458] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:39.636 10:46:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:39.636 10:46:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:39.636 10:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:39.895 [2024-07-12 10:46:15.042441] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:39.895 [2024-07-12 10:46:15.042493] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:39.895 [2024-07-12 10:46:15.042505] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:39.895 [2024-07-12 10:46:15.042518] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:39.895 [2024-07-12 10:46:15.042527] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:39.895 [2024-07-12 10:46:15.042539] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:39.895 [2024-07-12 10:46:15.042548] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:39.895 [2024-07-12 10:46:15.042559] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.895 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:40.154 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.154 "name": "Existed_Raid", 00:19:40.154 "uuid": "ec61c2cd-d65f-4804-8a2d-59ebfa0b7b07", 00:19:40.154 "strip_size_kb": 64, 00:19:40.154 "state": "configuring", 00:19:40.154 "raid_level": "concat", 00:19:40.154 "superblock": true, 00:19:40.154 "num_base_bdevs": 4, 00:19:40.154 "num_base_bdevs_discovered": 0, 00:19:40.154 "num_base_bdevs_operational": 4, 00:19:40.154 "base_bdevs_list": [ 00:19:40.154 { 00:19:40.154 "name": "BaseBdev1", 00:19:40.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.154 "is_configured": false, 00:19:40.154 "data_offset": 0, 00:19:40.154 "data_size": 0 00:19:40.154 }, 00:19:40.154 { 00:19:40.154 "name": "BaseBdev2", 00:19:40.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.154 "is_configured": false, 00:19:40.154 "data_offset": 0, 00:19:40.154 "data_size": 0 00:19:40.154 }, 00:19:40.154 { 00:19:40.154 "name": "BaseBdev3", 00:19:40.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.154 "is_configured": false, 00:19:40.154 "data_offset": 0, 00:19:40.154 "data_size": 0 00:19:40.154 }, 00:19:40.154 { 00:19:40.154 "name": "BaseBdev4", 00:19:40.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.154 "is_configured": false, 00:19:40.154 "data_offset": 0, 00:19:40.154 "data_size": 0 00:19:40.154 } 00:19:40.154 ] 00:19:40.154 }' 00:19:40.154 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.154 10:46:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:41.092 10:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:41.092 [2024-07-12 10:46:16.149222] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:41.092 [2024-07-12 10:46:16.149255] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a8aaa0 name Existed_Raid, state configuring 00:19:41.092 10:46:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:41.352 [2024-07-12 10:46:16.389890] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:41.352 [2024-07-12 10:46:16.389917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:41.352 [2024-07-12 10:46:16.389926] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:19:41.352 [2024-07-12 10:46:16.389938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:41.352 [2024-07-12 10:46:16.389946] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:41.352 [2024-07-12 10:46:16.389957] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:41.352 [2024-07-12 10:46:16.389966] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:41.352 [2024-07-12 10:46:16.389976] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:41.352 10:46:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:41.612 [2024-07-12 10:46:16.648449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:41.612 BaseBdev1 00:19:41.612 10:46:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:41.612 10:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:41.612 10:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:41.612 10:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:41.612 10:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:41.612 10:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:41.612 10:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:41.871 10:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:42.129 [ 00:19:42.129 { 00:19:42.129 "name": "BaseBdev1", 00:19:42.129 "aliases": [ 00:19:42.129 "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b" 00:19:42.129 ], 00:19:42.130 "product_name": "Malloc disk", 00:19:42.130 "block_size": 512, 00:19:42.130 "num_blocks": 65536, 00:19:42.130 "uuid": "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:42.130 "assigned_rate_limits": { 00:19:42.130 "rw_ios_per_sec": 0, 00:19:42.130 "rw_mbytes_per_sec": 0, 00:19:42.130 "r_mbytes_per_sec": 0, 00:19:42.130 "w_mbytes_per_sec": 0 00:19:42.130 }, 00:19:42.130 "claimed": true, 00:19:42.130 "claim_type": "exclusive_write", 00:19:42.130 "zoned": false, 00:19:42.130 "supported_io_types": { 00:19:42.130 "read": true, 00:19:42.130 "write": true, 00:19:42.130 "unmap": true, 00:19:42.130 "flush": true, 00:19:42.130 "reset": true, 00:19:42.130 "nvme_admin": false, 00:19:42.130 "nvme_io": false, 00:19:42.130 "nvme_io_md": false, 00:19:42.130 "write_zeroes": true, 00:19:42.130 "zcopy": true, 00:19:42.130 "get_zone_info": false, 00:19:42.130 "zone_management": false, 00:19:42.130 "zone_append": false, 00:19:42.130 "compare": false, 00:19:42.130 "compare_and_write": false, 00:19:42.130 "abort": true, 00:19:42.130 "seek_hole": false, 00:19:42.130 "seek_data": false, 00:19:42.130 "copy": true, 00:19:42.130 "nvme_iov_md": false 00:19:42.130 }, 00:19:42.130 "memory_domains": [ 00:19:42.130 { 00:19:42.130 
"dma_device_id": "system", 00:19:42.130 "dma_device_type": 1 00:19:42.130 }, 00:19:42.130 { 00:19:42.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.130 "dma_device_type": 2 00:19:42.130 } 00:19:42.130 ], 00:19:42.130 "driver_specific": {} 00:19:42.130 } 00:19:42.130 ] 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.130 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:42.388 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:42.389 "name": "Existed_Raid", 00:19:42.389 "uuid": "6e318306-444e-4631-be0d-1054a8a259a0", 00:19:42.389 "strip_size_kb": 64, 00:19:42.389 "state": "configuring", 00:19:42.389 "raid_level": "concat", 00:19:42.389 "superblock": true, 00:19:42.389 "num_base_bdevs": 4, 00:19:42.389 "num_base_bdevs_discovered": 1, 00:19:42.389 "num_base_bdevs_operational": 4, 00:19:42.389 "base_bdevs_list": [ 00:19:42.389 { 00:19:42.389 "name": "BaseBdev1", 00:19:42.389 "uuid": "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:42.389 "is_configured": true, 00:19:42.389 "data_offset": 2048, 00:19:42.389 "data_size": 63488 00:19:42.389 }, 00:19:42.389 { 00:19:42.389 "name": "BaseBdev2", 00:19:42.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.389 "is_configured": false, 00:19:42.389 "data_offset": 0, 00:19:42.389 "data_size": 0 00:19:42.389 }, 00:19:42.389 { 00:19:42.389 "name": "BaseBdev3", 00:19:42.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.389 "is_configured": false, 00:19:42.389 "data_offset": 0, 00:19:42.389 "data_size": 0 00:19:42.389 }, 00:19:42.389 { 00:19:42.389 "name": "BaseBdev4", 00:19:42.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.389 "is_configured": false, 00:19:42.389 "data_offset": 0, 00:19:42.389 "data_size": 0 00:19:42.389 } 00:19:42.389 ] 00:19:42.389 }' 00:19:42.389 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:42.389 10:46:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:19:42.955 10:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:43.214 [2024-07-12 10:46:18.220619] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:43.214 [2024-07-12 10:46:18.220664] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a8a310 name Existed_Raid, state configuring 00:19:43.214 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:43.474 [2024-07-12 10:46:18.465374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:43.474 [2024-07-12 10:46:18.466828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:43.474 [2024-07-12 10:46:18.466863] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:43.474 [2024-07-12 10:46:18.466873] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:43.474 [2024-07-12 10:46:18.466885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:43.474 [2024-07-12 10:46:18.466894] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:43.474 [2024-07-12 10:46:18.466905] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.474 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:43.734 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.734 "name": 
"Existed_Raid", 00:19:43.734 "uuid": "070590be-6241-4d56-9ae0-d10bbd267578", 00:19:43.734 "strip_size_kb": 64, 00:19:43.734 "state": "configuring", 00:19:43.734 "raid_level": "concat", 00:19:43.734 "superblock": true, 00:19:43.734 "num_base_bdevs": 4, 00:19:43.734 "num_base_bdevs_discovered": 1, 00:19:43.734 "num_base_bdevs_operational": 4, 00:19:43.734 "base_bdevs_list": [ 00:19:43.734 { 00:19:43.734 "name": "BaseBdev1", 00:19:43.734 "uuid": "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:43.734 "is_configured": true, 00:19:43.734 "data_offset": 2048, 00:19:43.734 "data_size": 63488 00:19:43.734 }, 00:19:43.734 { 00:19:43.734 "name": "BaseBdev2", 00:19:43.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.734 "is_configured": false, 00:19:43.734 "data_offset": 0, 00:19:43.734 "data_size": 0 00:19:43.734 }, 00:19:43.734 { 00:19:43.734 "name": "BaseBdev3", 00:19:43.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.734 "is_configured": false, 00:19:43.734 "data_offset": 0, 00:19:43.734 "data_size": 0 00:19:43.734 }, 00:19:43.734 { 00:19:43.734 "name": "BaseBdev4", 00:19:43.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.734 "is_configured": false, 00:19:43.734 "data_offset": 0, 00:19:43.734 "data_size": 0 00:19:43.734 } 00:19:43.734 ] 00:19:43.734 }' 00:19:43.734 10:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.734 10:46:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:44.303 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:44.562 [2024-07-12 10:46:19.547611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:44.562 BaseBdev2 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:44.562 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:44.821 [ 00:19:44.821 { 00:19:44.821 "name": "BaseBdev2", 00:19:44.821 "aliases": [ 00:19:44.821 "1e14ecac-afce-4bb9-8999-a4501a567817" 00:19:44.821 ], 00:19:44.821 "product_name": "Malloc disk", 00:19:44.821 "block_size": 512, 00:19:44.821 "num_blocks": 65536, 00:19:44.821 "uuid": "1e14ecac-afce-4bb9-8999-a4501a567817", 00:19:44.821 "assigned_rate_limits": { 00:19:44.821 "rw_ios_per_sec": 0, 00:19:44.821 "rw_mbytes_per_sec": 0, 00:19:44.821 "r_mbytes_per_sec": 0, 00:19:44.821 "w_mbytes_per_sec": 0 00:19:44.821 }, 00:19:44.821 "claimed": true, 
00:19:44.821 "claim_type": "exclusive_write", 00:19:44.821 "zoned": false, 00:19:44.821 "supported_io_types": { 00:19:44.821 "read": true, 00:19:44.821 "write": true, 00:19:44.821 "unmap": true, 00:19:44.821 "flush": true, 00:19:44.821 "reset": true, 00:19:44.821 "nvme_admin": false, 00:19:44.821 "nvme_io": false, 00:19:44.821 "nvme_io_md": false, 00:19:44.821 "write_zeroes": true, 00:19:44.821 "zcopy": true, 00:19:44.821 "get_zone_info": false, 00:19:44.821 "zone_management": false, 00:19:44.821 "zone_append": false, 00:19:44.821 "compare": false, 00:19:44.821 "compare_and_write": false, 00:19:44.821 "abort": true, 00:19:44.821 "seek_hole": false, 00:19:44.821 "seek_data": false, 00:19:44.821 "copy": true, 00:19:44.821 "nvme_iov_md": false 00:19:44.821 }, 00:19:44.822 "memory_domains": [ 00:19:44.822 { 00:19:44.822 "dma_device_id": "system", 00:19:44.822 "dma_device_type": 1 00:19:44.822 }, 00:19:44.822 { 00:19:44.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.822 "dma_device_type": 2 00:19:44.822 } 00:19:44.822 ], 00:19:44.822 "driver_specific": {} 00:19:44.822 } 00:19:44.822 ] 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:44.822 10:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.080 10:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:45.080 "name": "Existed_Raid", 00:19:45.080 "uuid": "070590be-6241-4d56-9ae0-d10bbd267578", 00:19:45.080 "strip_size_kb": 64, 00:19:45.080 "state": "configuring", 00:19:45.080 "raid_level": "concat", 00:19:45.080 "superblock": true, 00:19:45.080 "num_base_bdevs": 4, 00:19:45.080 "num_base_bdevs_discovered": 2, 00:19:45.080 "num_base_bdevs_operational": 4, 00:19:45.080 "base_bdevs_list": [ 00:19:45.080 { 00:19:45.080 "name": "BaseBdev1", 00:19:45.080 "uuid": 
"a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:45.080 "is_configured": true, 00:19:45.080 "data_offset": 2048, 00:19:45.080 "data_size": 63488 00:19:45.080 }, 00:19:45.080 { 00:19:45.080 "name": "BaseBdev2", 00:19:45.080 "uuid": "1e14ecac-afce-4bb9-8999-a4501a567817", 00:19:45.080 "is_configured": true, 00:19:45.080 "data_offset": 2048, 00:19:45.080 "data_size": 63488 00:19:45.080 }, 00:19:45.080 { 00:19:45.080 "name": "BaseBdev3", 00:19:45.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.080 "is_configured": false, 00:19:45.080 "data_offset": 0, 00:19:45.080 "data_size": 0 00:19:45.080 }, 00:19:45.080 { 00:19:45.080 "name": "BaseBdev4", 00:19:45.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.080 "is_configured": false, 00:19:45.080 "data_offset": 0, 00:19:45.081 "data_size": 0 00:19:45.081 } 00:19:45.081 ] 00:19:45.081 }' 00:19:45.081 10:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:45.081 10:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.649 10:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:45.908 [2024-07-12 10:46:20.996113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:45.908 BaseBdev3 00:19:45.908 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:45.908 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:45.908 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:45.908 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:45.908 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:45.908 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:45.908 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:46.167 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:46.426 [ 00:19:46.426 { 00:19:46.426 "name": "BaseBdev3", 00:19:46.426 "aliases": [ 00:19:46.426 "4fd368e7-4ad7-4c57-9745-6a49d8b30c50" 00:19:46.426 ], 00:19:46.426 "product_name": "Malloc disk", 00:19:46.426 "block_size": 512, 00:19:46.426 "num_blocks": 65536, 00:19:46.426 "uuid": "4fd368e7-4ad7-4c57-9745-6a49d8b30c50", 00:19:46.426 "assigned_rate_limits": { 00:19:46.426 "rw_ios_per_sec": 0, 00:19:46.426 "rw_mbytes_per_sec": 0, 00:19:46.426 "r_mbytes_per_sec": 0, 00:19:46.426 "w_mbytes_per_sec": 0 00:19:46.426 }, 00:19:46.426 "claimed": true, 00:19:46.426 "claim_type": "exclusive_write", 00:19:46.426 "zoned": false, 00:19:46.426 "supported_io_types": { 00:19:46.426 "read": true, 00:19:46.426 "write": true, 00:19:46.426 "unmap": true, 00:19:46.426 "flush": true, 00:19:46.426 "reset": true, 00:19:46.426 "nvme_admin": false, 00:19:46.426 "nvme_io": false, 00:19:46.426 "nvme_io_md": false, 00:19:46.426 "write_zeroes": true, 00:19:46.426 "zcopy": true, 00:19:46.426 "get_zone_info": 
false, 00:19:46.426 "zone_management": false, 00:19:46.426 "zone_append": false, 00:19:46.426 "compare": false, 00:19:46.426 "compare_and_write": false, 00:19:46.426 "abort": true, 00:19:46.426 "seek_hole": false, 00:19:46.426 "seek_data": false, 00:19:46.426 "copy": true, 00:19:46.426 "nvme_iov_md": false 00:19:46.426 }, 00:19:46.426 "memory_domains": [ 00:19:46.426 { 00:19:46.426 "dma_device_id": "system", 00:19:46.426 "dma_device_type": 1 00:19:46.426 }, 00:19:46.426 { 00:19:46.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.426 "dma_device_type": 2 00:19:46.426 } 00:19:46.426 ], 00:19:46.426 "driver_specific": {} 00:19:46.426 } 00:19:46.426 ] 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.426 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.685 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.685 "name": "Existed_Raid", 00:19:46.685 "uuid": "070590be-6241-4d56-9ae0-d10bbd267578", 00:19:46.685 "strip_size_kb": 64, 00:19:46.685 "state": "configuring", 00:19:46.685 "raid_level": "concat", 00:19:46.685 "superblock": true, 00:19:46.685 "num_base_bdevs": 4, 00:19:46.685 "num_base_bdevs_discovered": 3, 00:19:46.685 "num_base_bdevs_operational": 4, 00:19:46.685 "base_bdevs_list": [ 00:19:46.685 { 00:19:46.685 "name": "BaseBdev1", 00:19:46.685 "uuid": "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:46.685 "is_configured": true, 00:19:46.685 "data_offset": 2048, 00:19:46.685 "data_size": 63488 00:19:46.685 }, 00:19:46.685 { 00:19:46.685 "name": "BaseBdev2", 00:19:46.685 "uuid": "1e14ecac-afce-4bb9-8999-a4501a567817", 00:19:46.685 "is_configured": true, 00:19:46.685 "data_offset": 2048, 00:19:46.685 "data_size": 63488 00:19:46.685 }, 00:19:46.685 { 00:19:46.685 "name": "BaseBdev3", 00:19:46.685 "uuid": 
"4fd368e7-4ad7-4c57-9745-6a49d8b30c50", 00:19:46.685 "is_configured": true, 00:19:46.685 "data_offset": 2048, 00:19:46.685 "data_size": 63488 00:19:46.685 }, 00:19:46.685 { 00:19:46.685 "name": "BaseBdev4", 00:19:46.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.685 "is_configured": false, 00:19:46.685 "data_offset": 0, 00:19:46.685 "data_size": 0 00:19:46.685 } 00:19:46.685 ] 00:19:46.685 }' 00:19:46.685 10:46:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.685 10:46:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:47.251 10:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:47.509 [2024-07-12 10:46:22.564003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:47.509 [2024-07-12 10:46:22.564181] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a8b350 00:19:47.509 [2024-07-12 10:46:22.564196] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:47.509 [2024-07-12 10:46:22.564376] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8b020 00:19:47.509 [2024-07-12 10:46:22.564510] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a8b350 00:19:47.509 [2024-07-12 10:46:22.564521] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a8b350 00:19:47.509 [2024-07-12 10:46:22.564613] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:47.509 BaseBdev4 00:19:47.509 10:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:47.509 10:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:47.509 10:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:47.509 10:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:47.509 10:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:47.509 10:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:47.509 10:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:47.767 10:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:48.025 [ 00:19:48.025 { 00:19:48.025 "name": "BaseBdev4", 00:19:48.025 "aliases": [ 00:19:48.025 "38b30c97-99cc-4d08-8883-c164096290de" 00:19:48.025 ], 00:19:48.025 "product_name": "Malloc disk", 00:19:48.025 "block_size": 512, 00:19:48.025 "num_blocks": 65536, 00:19:48.025 "uuid": "38b30c97-99cc-4d08-8883-c164096290de", 00:19:48.025 "assigned_rate_limits": { 00:19:48.025 "rw_ios_per_sec": 0, 00:19:48.025 "rw_mbytes_per_sec": 0, 00:19:48.025 "r_mbytes_per_sec": 0, 00:19:48.025 "w_mbytes_per_sec": 0 00:19:48.025 }, 00:19:48.025 "claimed": true, 00:19:48.025 "claim_type": "exclusive_write", 00:19:48.025 "zoned": false, 00:19:48.025 "supported_io_types": { 00:19:48.025 "read": 
true, 00:19:48.025 "write": true, 00:19:48.025 "unmap": true, 00:19:48.025 "flush": true, 00:19:48.025 "reset": true, 00:19:48.025 "nvme_admin": false, 00:19:48.025 "nvme_io": false, 00:19:48.025 "nvme_io_md": false, 00:19:48.025 "write_zeroes": true, 00:19:48.025 "zcopy": true, 00:19:48.025 "get_zone_info": false, 00:19:48.025 "zone_management": false, 00:19:48.025 "zone_append": false, 00:19:48.025 "compare": false, 00:19:48.025 "compare_and_write": false, 00:19:48.025 "abort": true, 00:19:48.025 "seek_hole": false, 00:19:48.025 "seek_data": false, 00:19:48.025 "copy": true, 00:19:48.025 "nvme_iov_md": false 00:19:48.025 }, 00:19:48.025 "memory_domains": [ 00:19:48.025 { 00:19:48.025 "dma_device_id": "system", 00:19:48.025 "dma_device_type": 1 00:19:48.025 }, 00:19:48.025 { 00:19:48.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.025 "dma_device_type": 2 00:19:48.025 } 00:19:48.025 ], 00:19:48.025 "driver_specific": {} 00:19:48.025 } 00:19:48.025 ] 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.025 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:48.282 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.282 "name": "Existed_Raid", 00:19:48.282 "uuid": "070590be-6241-4d56-9ae0-d10bbd267578", 00:19:48.282 "strip_size_kb": 64, 00:19:48.282 "state": "online", 00:19:48.282 "raid_level": "concat", 00:19:48.282 "superblock": true, 00:19:48.282 "num_base_bdevs": 4, 00:19:48.282 "num_base_bdevs_discovered": 4, 00:19:48.282 "num_base_bdevs_operational": 4, 00:19:48.282 "base_bdevs_list": [ 00:19:48.282 { 00:19:48.282 "name": "BaseBdev1", 00:19:48.282 "uuid": "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:48.282 "is_configured": true, 00:19:48.282 "data_offset": 2048, 00:19:48.282 "data_size": 63488 00:19:48.282 }, 
00:19:48.282 { 00:19:48.282 "name": "BaseBdev2", 00:19:48.282 "uuid": "1e14ecac-afce-4bb9-8999-a4501a567817", 00:19:48.282 "is_configured": true, 00:19:48.282 "data_offset": 2048, 00:19:48.282 "data_size": 63488 00:19:48.282 }, 00:19:48.282 { 00:19:48.282 "name": "BaseBdev3", 00:19:48.282 "uuid": "4fd368e7-4ad7-4c57-9745-6a49d8b30c50", 00:19:48.282 "is_configured": true, 00:19:48.282 "data_offset": 2048, 00:19:48.282 "data_size": 63488 00:19:48.282 }, 00:19:48.282 { 00:19:48.282 "name": "BaseBdev4", 00:19:48.282 "uuid": "38b30c97-99cc-4d08-8883-c164096290de", 00:19:48.282 "is_configured": true, 00:19:48.282 "data_offset": 2048, 00:19:48.282 "data_size": 63488 00:19:48.282 } 00:19:48.282 ] 00:19:48.282 }' 00:19:48.282 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.282 10:46:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:48.876 10:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:49.134 [2024-07-12 10:46:24.092566] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:49.134 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:49.134 "name": "Existed_Raid", 00:19:49.134 "aliases": [ 00:19:49.134 "070590be-6241-4d56-9ae0-d10bbd267578" 00:19:49.134 ], 00:19:49.134 "product_name": "Raid Volume", 00:19:49.134 "block_size": 512, 00:19:49.134 "num_blocks": 253952, 00:19:49.134 "uuid": "070590be-6241-4d56-9ae0-d10bbd267578", 00:19:49.134 "assigned_rate_limits": { 00:19:49.134 "rw_ios_per_sec": 0, 00:19:49.134 "rw_mbytes_per_sec": 0, 00:19:49.134 "r_mbytes_per_sec": 0, 00:19:49.134 "w_mbytes_per_sec": 0 00:19:49.134 }, 00:19:49.134 "claimed": false, 00:19:49.134 "zoned": false, 00:19:49.134 "supported_io_types": { 00:19:49.134 "read": true, 00:19:49.134 "write": true, 00:19:49.134 "unmap": true, 00:19:49.134 "flush": true, 00:19:49.134 "reset": true, 00:19:49.134 "nvme_admin": false, 00:19:49.134 "nvme_io": false, 00:19:49.134 "nvme_io_md": false, 00:19:49.134 "write_zeroes": true, 00:19:49.134 "zcopy": false, 00:19:49.134 "get_zone_info": false, 00:19:49.134 "zone_management": false, 00:19:49.134 "zone_append": false, 00:19:49.134 "compare": false, 00:19:49.134 "compare_and_write": false, 00:19:49.134 "abort": false, 00:19:49.134 "seek_hole": false, 00:19:49.134 "seek_data": false, 00:19:49.134 "copy": false, 00:19:49.134 "nvme_iov_md": false 00:19:49.134 }, 00:19:49.134 "memory_domains": [ 00:19:49.134 { 00:19:49.134 "dma_device_id": "system", 00:19:49.134 "dma_device_type": 1 00:19:49.134 }, 
00:19:49.134 { 00:19:49.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.134 "dma_device_type": 2 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "dma_device_id": "system", 00:19:49.134 "dma_device_type": 1 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.134 "dma_device_type": 2 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "dma_device_id": "system", 00:19:49.134 "dma_device_type": 1 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.134 "dma_device_type": 2 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "dma_device_id": "system", 00:19:49.134 "dma_device_type": 1 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.134 "dma_device_type": 2 00:19:49.134 } 00:19:49.134 ], 00:19:49.134 "driver_specific": { 00:19:49.134 "raid": { 00:19:49.134 "uuid": "070590be-6241-4d56-9ae0-d10bbd267578", 00:19:49.134 "strip_size_kb": 64, 00:19:49.134 "state": "online", 00:19:49.134 "raid_level": "concat", 00:19:49.134 "superblock": true, 00:19:49.134 "num_base_bdevs": 4, 00:19:49.134 "num_base_bdevs_discovered": 4, 00:19:49.134 "num_base_bdevs_operational": 4, 00:19:49.134 "base_bdevs_list": [ 00:19:49.134 { 00:19:49.134 "name": "BaseBdev1", 00:19:49.134 "uuid": "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:49.134 "is_configured": true, 00:19:49.134 "data_offset": 2048, 00:19:49.134 "data_size": 63488 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "name": "BaseBdev2", 00:19:49.134 "uuid": "1e14ecac-afce-4bb9-8999-a4501a567817", 00:19:49.134 "is_configured": true, 00:19:49.134 "data_offset": 2048, 00:19:49.134 "data_size": 63488 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "name": "BaseBdev3", 00:19:49.134 "uuid": "4fd368e7-4ad7-4c57-9745-6a49d8b30c50", 00:19:49.134 "is_configured": true, 00:19:49.134 "data_offset": 2048, 00:19:49.134 "data_size": 63488 00:19:49.134 }, 00:19:49.134 { 00:19:49.134 "name": "BaseBdev4", 00:19:49.134 "uuid": "38b30c97-99cc-4d08-8883-c164096290de", 00:19:49.134 "is_configured": true, 00:19:49.134 "data_offset": 2048, 00:19:49.134 "data_size": 63488 00:19:49.134 } 00:19:49.134 ] 00:19:49.134 } 00:19:49.134 } 00:19:49.134 }' 00:19:49.134 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:49.134 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:49.134 BaseBdev2 00:19:49.134 BaseBdev3 00:19:49.134 BaseBdev4' 00:19:49.134 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:49.134 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:49.134 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:49.392 "name": "BaseBdev1", 00:19:49.392 "aliases": [ 00:19:49.392 "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b" 00:19:49.392 ], 00:19:49.392 "product_name": "Malloc disk", 00:19:49.392 "block_size": 512, 00:19:49.392 "num_blocks": 65536, 00:19:49.392 "uuid": "a1b38cdf-ce18-49cd-b7ed-9b9bec554d3b", 00:19:49.392 "assigned_rate_limits": { 00:19:49.392 "rw_ios_per_sec": 0, 00:19:49.392 "rw_mbytes_per_sec": 0, 00:19:49.392 "r_mbytes_per_sec": 0, 00:19:49.392 
"w_mbytes_per_sec": 0 00:19:49.392 }, 00:19:49.392 "claimed": true, 00:19:49.392 "claim_type": "exclusive_write", 00:19:49.392 "zoned": false, 00:19:49.392 "supported_io_types": { 00:19:49.392 "read": true, 00:19:49.392 "write": true, 00:19:49.392 "unmap": true, 00:19:49.392 "flush": true, 00:19:49.392 "reset": true, 00:19:49.392 "nvme_admin": false, 00:19:49.392 "nvme_io": false, 00:19:49.392 "nvme_io_md": false, 00:19:49.392 "write_zeroes": true, 00:19:49.392 "zcopy": true, 00:19:49.392 "get_zone_info": false, 00:19:49.392 "zone_management": false, 00:19:49.392 "zone_append": false, 00:19:49.392 "compare": false, 00:19:49.392 "compare_and_write": false, 00:19:49.392 "abort": true, 00:19:49.392 "seek_hole": false, 00:19:49.392 "seek_data": false, 00:19:49.392 "copy": true, 00:19:49.392 "nvme_iov_md": false 00:19:49.392 }, 00:19:49.392 "memory_domains": [ 00:19:49.392 { 00:19:49.392 "dma_device_id": "system", 00:19:49.392 "dma_device_type": 1 00:19:49.392 }, 00:19:49.392 { 00:19:49.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.392 "dma_device_type": 2 00:19:49.392 } 00:19:49.392 ], 00:19:49.392 "driver_specific": {} 00:19:49.392 }' 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:49.392 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:49.650 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:49.908 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:49.908 "name": "BaseBdev2", 00:19:49.908 "aliases": [ 00:19:49.908 "1e14ecac-afce-4bb9-8999-a4501a567817" 00:19:49.908 ], 00:19:49.908 "product_name": "Malloc disk", 00:19:49.908 "block_size": 512, 00:19:49.908 "num_blocks": 65536, 00:19:49.908 "uuid": "1e14ecac-afce-4bb9-8999-a4501a567817", 00:19:49.908 "assigned_rate_limits": { 00:19:49.908 "rw_ios_per_sec": 0, 00:19:49.908 "rw_mbytes_per_sec": 0, 00:19:49.908 "r_mbytes_per_sec": 0, 00:19:49.908 "w_mbytes_per_sec": 0 00:19:49.908 }, 00:19:49.908 "claimed": true, 00:19:49.908 "claim_type": "exclusive_write", 00:19:49.908 
"zoned": false, 00:19:49.908 "supported_io_types": { 00:19:49.908 "read": true, 00:19:49.908 "write": true, 00:19:49.908 "unmap": true, 00:19:49.908 "flush": true, 00:19:49.908 "reset": true, 00:19:49.908 "nvme_admin": false, 00:19:49.908 "nvme_io": false, 00:19:49.908 "nvme_io_md": false, 00:19:49.908 "write_zeroes": true, 00:19:49.908 "zcopy": true, 00:19:49.908 "get_zone_info": false, 00:19:49.908 "zone_management": false, 00:19:49.908 "zone_append": false, 00:19:49.908 "compare": false, 00:19:49.908 "compare_and_write": false, 00:19:49.908 "abort": true, 00:19:49.908 "seek_hole": false, 00:19:49.908 "seek_data": false, 00:19:49.908 "copy": true, 00:19:49.908 "nvme_iov_md": false 00:19:49.908 }, 00:19:49.908 "memory_domains": [ 00:19:49.908 { 00:19:49.908 "dma_device_id": "system", 00:19:49.908 "dma_device_type": 1 00:19:49.908 }, 00:19:49.908 { 00:19:49.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.908 "dma_device_type": 2 00:19:49.908 } 00:19:49.908 ], 00:19:49.908 "driver_specific": {} 00:19:49.908 }' 00:19:49.908 10:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.908 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.908 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:49.908 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:50.166 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:50.424 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:50.424 "name": "BaseBdev3", 00:19:50.424 "aliases": [ 00:19:50.424 "4fd368e7-4ad7-4c57-9745-6a49d8b30c50" 00:19:50.424 ], 00:19:50.424 "product_name": "Malloc disk", 00:19:50.424 "block_size": 512, 00:19:50.424 "num_blocks": 65536, 00:19:50.424 "uuid": "4fd368e7-4ad7-4c57-9745-6a49d8b30c50", 00:19:50.424 "assigned_rate_limits": { 00:19:50.424 "rw_ios_per_sec": 0, 00:19:50.424 "rw_mbytes_per_sec": 0, 00:19:50.424 "r_mbytes_per_sec": 0, 00:19:50.424 "w_mbytes_per_sec": 0 00:19:50.424 }, 00:19:50.424 "claimed": true, 00:19:50.424 "claim_type": "exclusive_write", 00:19:50.424 "zoned": false, 00:19:50.424 "supported_io_types": { 00:19:50.424 "read": true, 00:19:50.424 "write": true, 00:19:50.424 "unmap": 
true, 00:19:50.424 "flush": true, 00:19:50.424 "reset": true, 00:19:50.424 "nvme_admin": false, 00:19:50.424 "nvme_io": false, 00:19:50.424 "nvme_io_md": false, 00:19:50.424 "write_zeroes": true, 00:19:50.424 "zcopy": true, 00:19:50.424 "get_zone_info": false, 00:19:50.424 "zone_management": false, 00:19:50.424 "zone_append": false, 00:19:50.424 "compare": false, 00:19:50.424 "compare_and_write": false, 00:19:50.424 "abort": true, 00:19:50.424 "seek_hole": false, 00:19:50.424 "seek_data": false, 00:19:50.424 "copy": true, 00:19:50.424 "nvme_iov_md": false 00:19:50.424 }, 00:19:50.424 "memory_domains": [ 00:19:50.424 { 00:19:50.424 "dma_device_id": "system", 00:19:50.424 "dma_device_type": 1 00:19:50.424 }, 00:19:50.424 { 00:19:50.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.424 "dma_device_type": 2 00:19:50.424 } 00:19:50.424 ], 00:19:50.424 "driver_specific": {} 00:19:50.424 }' 00:19:50.424 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:50.682 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.940 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.940 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:50.940 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:50.940 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:50.940 10:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:51.198 "name": "BaseBdev4", 00:19:51.198 "aliases": [ 00:19:51.198 "38b30c97-99cc-4d08-8883-c164096290de" 00:19:51.198 ], 00:19:51.198 "product_name": "Malloc disk", 00:19:51.198 "block_size": 512, 00:19:51.198 "num_blocks": 65536, 00:19:51.198 "uuid": "38b30c97-99cc-4d08-8883-c164096290de", 00:19:51.198 "assigned_rate_limits": { 00:19:51.198 "rw_ios_per_sec": 0, 00:19:51.198 "rw_mbytes_per_sec": 0, 00:19:51.198 "r_mbytes_per_sec": 0, 00:19:51.198 "w_mbytes_per_sec": 0 00:19:51.198 }, 00:19:51.198 "claimed": true, 00:19:51.198 "claim_type": "exclusive_write", 00:19:51.198 "zoned": false, 00:19:51.198 "supported_io_types": { 00:19:51.198 "read": true, 00:19:51.198 "write": true, 00:19:51.198 "unmap": true, 00:19:51.198 "flush": true, 00:19:51.198 "reset": true, 00:19:51.198 "nvme_admin": false, 00:19:51.198 "nvme_io": false, 
00:19:51.198 "nvme_io_md": false, 00:19:51.198 "write_zeroes": true, 00:19:51.198 "zcopy": true, 00:19:51.198 "get_zone_info": false, 00:19:51.198 "zone_management": false, 00:19:51.198 "zone_append": false, 00:19:51.198 "compare": false, 00:19:51.198 "compare_and_write": false, 00:19:51.198 "abort": true, 00:19:51.198 "seek_hole": false, 00:19:51.198 "seek_data": false, 00:19:51.198 "copy": true, 00:19:51.198 "nvme_iov_md": false 00:19:51.198 }, 00:19:51.198 "memory_domains": [ 00:19:51.198 { 00:19:51.198 "dma_device_id": "system", 00:19:51.198 "dma_device_type": 1 00:19:51.198 }, 00:19:51.198 { 00:19:51.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.198 "dma_device_type": 2 00:19:51.198 } 00:19:51.198 ], 00:19:51.198 "driver_specific": {} 00:19:51.198 }' 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:51.198 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.455 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.455 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:51.455 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.455 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.455 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:51.455 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:51.724 [2024-07-12 10:46:26.743376] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:51.724 [2024-07-12 10:46:26.743412] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:51.724 [2024-07-12 10:46:26.743463] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:51.724 10:46:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.724 10:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.999 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.999 "name": "Existed_Raid", 00:19:51.999 "uuid": "070590be-6241-4d56-9ae0-d10bbd267578", 00:19:51.999 "strip_size_kb": 64, 00:19:51.999 "state": "offline", 00:19:51.999 "raid_level": "concat", 00:19:51.999 "superblock": true, 00:19:51.999 "num_base_bdevs": 4, 00:19:51.999 "num_base_bdevs_discovered": 3, 00:19:51.999 "num_base_bdevs_operational": 3, 00:19:51.999 "base_bdevs_list": [ 00:19:51.999 { 00:19:51.999 "name": null, 00:19:51.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.999 "is_configured": false, 00:19:51.999 "data_offset": 2048, 00:19:51.999 "data_size": 63488 00:19:51.999 }, 00:19:51.999 { 00:19:51.999 "name": "BaseBdev2", 00:19:51.999 "uuid": "1e14ecac-afce-4bb9-8999-a4501a567817", 00:19:51.999 "is_configured": true, 00:19:51.999 "data_offset": 2048, 00:19:51.999 "data_size": 63488 00:19:51.999 }, 00:19:51.999 { 00:19:51.999 "name": "BaseBdev3", 00:19:51.999 "uuid": "4fd368e7-4ad7-4c57-9745-6a49d8b30c50", 00:19:51.999 "is_configured": true, 00:19:51.999 "data_offset": 2048, 00:19:51.999 "data_size": 63488 00:19:51.999 }, 00:19:51.999 { 00:19:51.999 "name": "BaseBdev4", 00:19:51.999 "uuid": "38b30c97-99cc-4d08-8883-c164096290de", 00:19:51.999 "is_configured": true, 00:19:51.999 "data_offset": 2048, 00:19:51.999 "data_size": 63488 00:19:51.999 } 00:19:51.999 ] 00:19:51.999 }' 00:19:51.999 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.999 10:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.564 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:52.564 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:52.564 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.564 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:52.822 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:52.822 10:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:52.822 10:46:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:52.822 [2024-07-12 10:46:28.008652] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:53.081 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:53.081 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:53.081 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.081 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:53.339 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:53.339 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:53.339 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:53.339 [2024-07-12 10:46:28.516564] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:53.597 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:53.597 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:53.597 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.597 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:53.855 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:53.855 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:53.855 10:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:53.855 [2024-07-12 10:46:29.022278] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:53.855 [2024-07-12 10:46:29.022325] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a8b350 name Existed_Raid, state offline 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:54.113 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:54.371 BaseBdev2 00:19:54.371 10:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:54.371 10:46:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:54.371 10:46:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:54.371 10:46:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:54.371 10:46:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:54.371 10:46:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:54.371 10:46:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:54.629 10:46:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:54.887 [ 00:19:54.887 { 00:19:54.887 "name": "BaseBdev2", 00:19:54.887 "aliases": [ 00:19:54.887 "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14" 00:19:54.887 ], 00:19:54.887 "product_name": "Malloc disk", 00:19:54.887 "block_size": 512, 00:19:54.887 "num_blocks": 65536, 00:19:54.887 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:19:54.887 "assigned_rate_limits": { 00:19:54.887 "rw_ios_per_sec": 0, 00:19:54.887 "rw_mbytes_per_sec": 0, 00:19:54.887 "r_mbytes_per_sec": 0, 00:19:54.887 "w_mbytes_per_sec": 0 00:19:54.887 }, 00:19:54.887 "claimed": false, 00:19:54.887 "zoned": false, 00:19:54.887 "supported_io_types": { 00:19:54.887 "read": true, 00:19:54.887 "write": true, 00:19:54.887 "unmap": true, 00:19:54.887 "flush": true, 00:19:54.887 "reset": true, 00:19:54.887 "nvme_admin": false, 00:19:54.887 "nvme_io": false, 00:19:54.887 "nvme_io_md": false, 00:19:54.887 "write_zeroes": true, 00:19:54.887 "zcopy": true, 00:19:54.887 "get_zone_info": false, 00:19:54.887 "zone_management": false, 00:19:54.887 "zone_append": false, 00:19:54.887 "compare": false, 00:19:54.887 "compare_and_write": false, 00:19:54.887 "abort": true, 00:19:54.887 "seek_hole": false, 00:19:54.887 "seek_data": false, 00:19:54.887 "copy": true, 00:19:54.887 "nvme_iov_md": false 00:19:54.887 }, 00:19:54.887 "memory_domains": [ 00:19:54.887 { 00:19:54.887 "dma_device_id": "system", 00:19:54.887 "dma_device_type": 1 00:19:54.887 }, 00:19:54.887 { 00:19:54.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.887 "dma_device_type": 2 00:19:54.887 } 00:19:54.887 ], 00:19:54.887 "driver_specific": {} 00:19:54.887 } 00:19:54.887 ] 00:19:54.887 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:54.887 10:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:54.887 10:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:54.887 10:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:55.144 BaseBdev3 00:19:55.144 10:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:55.144 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:55.144 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:55.144 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:55.144 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:55.144 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:55.144 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:55.401 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:55.658 [ 00:19:55.658 { 00:19:55.658 "name": "BaseBdev3", 00:19:55.658 "aliases": [ 00:19:55.658 "4228a065-55d7-4bb8-8271-81bb733a948e" 00:19:55.658 ], 00:19:55.658 "product_name": "Malloc disk", 00:19:55.658 "block_size": 512, 00:19:55.658 "num_blocks": 65536, 00:19:55.658 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:19:55.658 "assigned_rate_limits": { 00:19:55.658 "rw_ios_per_sec": 0, 00:19:55.658 "rw_mbytes_per_sec": 0, 00:19:55.658 "r_mbytes_per_sec": 0, 00:19:55.658 "w_mbytes_per_sec": 0 00:19:55.658 }, 00:19:55.658 "claimed": false, 00:19:55.658 "zoned": false, 00:19:55.658 "supported_io_types": { 00:19:55.658 "read": true, 00:19:55.658 "write": true, 00:19:55.658 "unmap": true, 00:19:55.658 "flush": true, 00:19:55.658 "reset": true, 00:19:55.658 "nvme_admin": false, 00:19:55.658 "nvme_io": false, 00:19:55.658 "nvme_io_md": false, 00:19:55.658 "write_zeroes": true, 00:19:55.658 "zcopy": true, 00:19:55.658 "get_zone_info": false, 00:19:55.658 "zone_management": false, 00:19:55.658 "zone_append": false, 00:19:55.658 "compare": false, 00:19:55.658 "compare_and_write": false, 00:19:55.658 "abort": true, 00:19:55.658 "seek_hole": false, 00:19:55.658 "seek_data": false, 00:19:55.658 "copy": true, 00:19:55.658 "nvme_iov_md": false 00:19:55.658 }, 00:19:55.658 "memory_domains": [ 00:19:55.658 { 00:19:55.658 "dma_device_id": "system", 00:19:55.658 "dma_device_type": 1 00:19:55.658 }, 00:19:55.658 { 00:19:55.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.658 "dma_device_type": 2 00:19:55.658 } 00:19:55.658 ], 00:19:55.658 "driver_specific": {} 00:19:55.658 } 00:19:55.658 ] 00:19:55.658 10:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:55.658 10:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:55.658 10:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:55.658 10:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:55.917 BaseBdev4 00:19:55.917 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:19:55.917 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:55.917 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:55.917 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:55.917 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:55.917 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:55.917 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:56.175 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:56.432 [ 00:19:56.432 { 00:19:56.432 "name": "BaseBdev4", 00:19:56.432 "aliases": [ 00:19:56.433 "06335a06-f8f1-466f-9fbd-db19f9d12c05" 00:19:56.433 ], 00:19:56.433 "product_name": "Malloc disk", 00:19:56.433 "block_size": 512, 00:19:56.433 "num_blocks": 65536, 00:19:56.433 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:19:56.433 "assigned_rate_limits": { 00:19:56.433 "rw_ios_per_sec": 0, 00:19:56.433 "rw_mbytes_per_sec": 0, 00:19:56.433 "r_mbytes_per_sec": 0, 00:19:56.433 "w_mbytes_per_sec": 0 00:19:56.433 }, 00:19:56.433 "claimed": false, 00:19:56.433 "zoned": false, 00:19:56.433 "supported_io_types": { 00:19:56.433 "read": true, 00:19:56.433 "write": true, 00:19:56.433 "unmap": true, 00:19:56.433 "flush": true, 00:19:56.433 "reset": true, 00:19:56.433 "nvme_admin": false, 00:19:56.433 "nvme_io": false, 00:19:56.433 "nvme_io_md": false, 00:19:56.433 "write_zeroes": true, 00:19:56.433 "zcopy": true, 00:19:56.433 "get_zone_info": false, 00:19:56.433 "zone_management": false, 00:19:56.433 "zone_append": false, 00:19:56.433 "compare": false, 00:19:56.433 "compare_and_write": false, 00:19:56.433 "abort": true, 00:19:56.433 "seek_hole": false, 00:19:56.433 "seek_data": false, 00:19:56.433 "copy": true, 00:19:56.433 "nvme_iov_md": false 00:19:56.433 }, 00:19:56.433 "memory_domains": [ 00:19:56.433 { 00:19:56.433 "dma_device_id": "system", 00:19:56.433 "dma_device_type": 1 00:19:56.433 }, 00:19:56.433 { 00:19:56.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.433 "dma_device_type": 2 00:19:56.433 } 00:19:56.433 ], 00:19:56.433 "driver_specific": {} 00:19:56.433 } 00:19:56.433 ] 00:19:56.433 10:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:56.433 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:56.433 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:56.433 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:56.690 [2024-07-12 10:46:31.780332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:56.690 [2024-07-12 10:46:31.780377] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:56.690 [2024-07-12 10:46:31.780395] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:56.690 [2024-07-12 10:46:31.781725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:56.690 [2024-07-12 10:46:31.781774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.690 10:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:56.949 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.949 "name": "Existed_Raid", 00:19:56.949 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:19:56.949 "strip_size_kb": 64, 00:19:56.949 "state": "configuring", 00:19:56.949 "raid_level": "concat", 00:19:56.949 "superblock": true, 00:19:56.949 "num_base_bdevs": 4, 00:19:56.949 "num_base_bdevs_discovered": 3, 00:19:56.949 "num_base_bdevs_operational": 4, 00:19:56.949 "base_bdevs_list": [ 00:19:56.949 { 00:19:56.949 "name": "BaseBdev1", 00:19:56.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.949 "is_configured": false, 00:19:56.949 "data_offset": 0, 00:19:56.949 "data_size": 0 00:19:56.949 }, 00:19:56.949 { 00:19:56.949 "name": "BaseBdev2", 00:19:56.949 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:19:56.949 "is_configured": true, 00:19:56.949 "data_offset": 2048, 00:19:56.949 "data_size": 63488 00:19:56.949 }, 00:19:56.949 { 00:19:56.949 "name": "BaseBdev3", 00:19:56.949 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:19:56.949 "is_configured": true, 00:19:56.949 "data_offset": 2048, 00:19:56.949 "data_size": 63488 00:19:56.949 }, 00:19:56.949 { 00:19:56.949 "name": "BaseBdev4", 00:19:56.949 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:19:56.949 "is_configured": true, 00:19:56.949 "data_offset": 2048, 00:19:56.949 "data_size": 63488 00:19:56.949 } 00:19:56.949 ] 00:19:56.949 }' 00:19:56.949 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.949 10:46:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:57.515 10:46:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:57.774 [2024-07-12 10:46:32.875185] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.774 10:46:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:58.032 10:46:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.032 "name": "Existed_Raid", 00:19:58.032 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:19:58.032 "strip_size_kb": 64, 00:19:58.032 "state": "configuring", 00:19:58.032 "raid_level": "concat", 00:19:58.032 "superblock": true, 00:19:58.032 "num_base_bdevs": 4, 00:19:58.032 "num_base_bdevs_discovered": 2, 00:19:58.032 "num_base_bdevs_operational": 4, 00:19:58.032 "base_bdevs_list": [ 00:19:58.032 { 00:19:58.032 "name": "BaseBdev1", 00:19:58.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.032 "is_configured": false, 00:19:58.032 "data_offset": 0, 00:19:58.032 "data_size": 0 00:19:58.032 }, 00:19:58.032 { 00:19:58.032 "name": null, 00:19:58.032 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:19:58.032 "is_configured": false, 00:19:58.032 "data_offset": 2048, 00:19:58.032 "data_size": 63488 00:19:58.032 }, 00:19:58.032 { 00:19:58.032 "name": "BaseBdev3", 00:19:58.032 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:19:58.032 "is_configured": true, 00:19:58.032 "data_offset": 2048, 00:19:58.032 "data_size": 63488 00:19:58.032 }, 00:19:58.032 { 00:19:58.032 "name": "BaseBdev4", 00:19:58.032 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:19:58.032 "is_configured": true, 00:19:58.032 "data_offset": 2048, 00:19:58.032 "data_size": 63488 00:19:58.032 } 00:19:58.032 ] 00:19:58.032 }' 00:19:58.032 10:46:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.032 10:46:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:58.598 10:46:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.598 10:46:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:58.855 10:46:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:58.855 10:46:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:59.114 [2024-07-12 10:46:34.155131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:59.114 BaseBdev1 00:19:59.114 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:59.114 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:59.114 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:59.114 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:59.114 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:59.114 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:59.114 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:59.373 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:59.631 [ 00:19:59.631 { 00:19:59.631 "name": "BaseBdev1", 00:19:59.631 "aliases": [ 00:19:59.631 "7127c52b-4792-48a7-85f0-7108f46099a4" 00:19:59.631 ], 00:19:59.631 "product_name": "Malloc disk", 00:19:59.631 "block_size": 512, 00:19:59.631 "num_blocks": 65536, 00:19:59.631 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:19:59.631 "assigned_rate_limits": { 00:19:59.631 "rw_ios_per_sec": 0, 00:19:59.631 "rw_mbytes_per_sec": 0, 00:19:59.631 "r_mbytes_per_sec": 0, 00:19:59.631 "w_mbytes_per_sec": 0 00:19:59.631 }, 00:19:59.631 "claimed": true, 00:19:59.631 "claim_type": "exclusive_write", 00:19:59.631 "zoned": false, 00:19:59.631 "supported_io_types": { 00:19:59.631 "read": true, 00:19:59.631 "write": true, 00:19:59.631 "unmap": true, 00:19:59.631 "flush": true, 00:19:59.631 "reset": true, 00:19:59.631 "nvme_admin": false, 00:19:59.631 "nvme_io": false, 00:19:59.631 "nvme_io_md": false, 00:19:59.631 "write_zeroes": true, 00:19:59.631 "zcopy": true, 00:19:59.631 "get_zone_info": false, 00:19:59.631 "zone_management": false, 00:19:59.631 "zone_append": false, 00:19:59.631 "compare": false, 00:19:59.631 "compare_and_write": false, 00:19:59.631 "abort": true, 00:19:59.631 "seek_hole": false, 00:19:59.631 "seek_data": false, 00:19:59.631 "copy": true, 00:19:59.631 "nvme_iov_md": false 00:19:59.631 }, 00:19:59.631 "memory_domains": [ 00:19:59.631 { 00:19:59.631 "dma_device_id": "system", 00:19:59.631 "dma_device_type": 1 00:19:59.631 }, 00:19:59.631 { 00:19:59.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.631 "dma_device_type": 2 00:19:59.631 } 00:19:59.631 ], 00:19:59.631 "driver_specific": {} 00:19:59.631 } 00:19:59.631 ] 00:19:59.631 
10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.631 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:59.889 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.889 "name": "Existed_Raid", 00:19:59.889 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:19:59.889 "strip_size_kb": 64, 00:19:59.889 "state": "configuring", 00:19:59.889 "raid_level": "concat", 00:19:59.889 "superblock": true, 00:19:59.889 "num_base_bdevs": 4, 00:19:59.889 "num_base_bdevs_discovered": 3, 00:19:59.889 "num_base_bdevs_operational": 4, 00:19:59.889 "base_bdevs_list": [ 00:19:59.889 { 00:19:59.889 "name": "BaseBdev1", 00:19:59.889 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:19:59.889 "is_configured": true, 00:19:59.889 "data_offset": 2048, 00:19:59.889 "data_size": 63488 00:19:59.889 }, 00:19:59.889 { 00:19:59.889 "name": null, 00:19:59.889 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:19:59.889 "is_configured": false, 00:19:59.889 "data_offset": 2048, 00:19:59.889 "data_size": 63488 00:19:59.889 }, 00:19:59.889 { 00:19:59.889 "name": "BaseBdev3", 00:19:59.889 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:19:59.889 "is_configured": true, 00:19:59.889 "data_offset": 2048, 00:19:59.889 "data_size": 63488 00:19:59.889 }, 00:19:59.889 { 00:19:59.889 "name": "BaseBdev4", 00:19:59.889 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:19:59.889 "is_configured": true, 00:19:59.889 "data_offset": 2048, 00:19:59.889 "data_size": 63488 00:19:59.889 } 00:19:59.889 ] 00:19:59.889 }' 00:19:59.889 10:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.889 10:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:00.456 10:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.456 10:46:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:00.714 10:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:00.714 10:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:00.973 [2024-07-12 10:46:35.996065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.973 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:01.231 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.231 "name": "Existed_Raid", 00:20:01.231 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:20:01.231 "strip_size_kb": 64, 00:20:01.231 "state": "configuring", 00:20:01.231 "raid_level": "concat", 00:20:01.231 "superblock": true, 00:20:01.231 "num_base_bdevs": 4, 00:20:01.231 "num_base_bdevs_discovered": 2, 00:20:01.231 "num_base_bdevs_operational": 4, 00:20:01.231 "base_bdevs_list": [ 00:20:01.231 { 00:20:01.231 "name": "BaseBdev1", 00:20:01.231 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:01.231 "is_configured": true, 00:20:01.231 "data_offset": 2048, 00:20:01.231 "data_size": 63488 00:20:01.231 }, 00:20:01.231 { 00:20:01.231 "name": null, 00:20:01.231 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:20:01.231 "is_configured": false, 00:20:01.231 "data_offset": 2048, 00:20:01.231 "data_size": 63488 00:20:01.231 }, 00:20:01.231 { 00:20:01.231 "name": null, 00:20:01.231 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:20:01.231 "is_configured": false, 00:20:01.231 "data_offset": 2048, 00:20:01.231 "data_size": 63488 00:20:01.231 }, 00:20:01.231 { 00:20:01.231 "name": "BaseBdev4", 00:20:01.231 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:20:01.231 "is_configured": true, 00:20:01.231 "data_offset": 2048, 00:20:01.231 "data_size": 63488 00:20:01.231 } 00:20:01.231 ] 00:20:01.231 }' 00:20:01.231 10:46:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.231 10:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:01.797 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.797 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:01.797 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:01.797 10:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:02.055 [2024-07-12 10:46:37.211353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.055 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:02.313 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.313 "name": "Existed_Raid", 00:20:02.313 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:20:02.313 "strip_size_kb": 64, 00:20:02.313 "state": "configuring", 00:20:02.313 "raid_level": "concat", 00:20:02.313 "superblock": true, 00:20:02.313 "num_base_bdevs": 4, 00:20:02.313 "num_base_bdevs_discovered": 3, 00:20:02.313 "num_base_bdevs_operational": 4, 00:20:02.313 "base_bdevs_list": [ 00:20:02.313 { 00:20:02.313 "name": "BaseBdev1", 00:20:02.313 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:02.313 "is_configured": true, 00:20:02.313 "data_offset": 2048, 00:20:02.313 "data_size": 63488 00:20:02.313 }, 00:20:02.313 { 00:20:02.313 "name": null, 00:20:02.313 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:20:02.313 "is_configured": false, 00:20:02.313 "data_offset": 2048, 00:20:02.313 "data_size": 63488 00:20:02.313 }, 00:20:02.313 { 00:20:02.313 "name": "BaseBdev3", 00:20:02.313 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 
00:20:02.313 "is_configured": true, 00:20:02.313 "data_offset": 2048, 00:20:02.313 "data_size": 63488 00:20:02.313 }, 00:20:02.313 { 00:20:02.313 "name": "BaseBdev4", 00:20:02.313 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:20:02.313 "is_configured": true, 00:20:02.313 "data_offset": 2048, 00:20:02.313 "data_size": 63488 00:20:02.313 } 00:20:02.313 ] 00:20:02.313 }' 00:20:02.313 10:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.313 10:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:02.877 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.877 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:03.135 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:03.135 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:03.393 [2024-07-12 10:46:38.546898] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.393 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.394 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.394 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.652 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.652 "name": "Existed_Raid", 00:20:03.652 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:20:03.652 "strip_size_kb": 64, 00:20:03.652 "state": "configuring", 00:20:03.652 "raid_level": "concat", 00:20:03.652 "superblock": true, 00:20:03.652 "num_base_bdevs": 4, 00:20:03.652 "num_base_bdevs_discovered": 2, 00:20:03.652 "num_base_bdevs_operational": 4, 00:20:03.652 "base_bdevs_list": [ 00:20:03.652 { 00:20:03.652 "name": null, 00:20:03.652 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:03.652 "is_configured": false, 00:20:03.652 "data_offset": 
2048, 00:20:03.652 "data_size": 63488 00:20:03.652 }, 00:20:03.652 { 00:20:03.652 "name": null, 00:20:03.652 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:20:03.652 "is_configured": false, 00:20:03.652 "data_offset": 2048, 00:20:03.652 "data_size": 63488 00:20:03.652 }, 00:20:03.652 { 00:20:03.652 "name": "BaseBdev3", 00:20:03.652 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:20:03.652 "is_configured": true, 00:20:03.652 "data_offset": 2048, 00:20:03.652 "data_size": 63488 00:20:03.652 }, 00:20:03.652 { 00:20:03.652 "name": "BaseBdev4", 00:20:03.652 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:20:03.652 "is_configured": true, 00:20:03.652 "data_offset": 2048, 00:20:03.652 "data_size": 63488 00:20:03.652 } 00:20:03.652 ] 00:20:03.652 }' 00:20:03.652 10:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.652 10:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:04.219 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.219 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:04.478 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:04.478 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:04.736 [2024-07-12 10:46:39.808904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.736 10:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.994 10:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.994 "name": "Existed_Raid", 00:20:04.994 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:20:04.994 "strip_size_kb": 64, 
00:20:04.994 "state": "configuring", 00:20:04.994 "raid_level": "concat", 00:20:04.994 "superblock": true, 00:20:04.994 "num_base_bdevs": 4, 00:20:04.994 "num_base_bdevs_discovered": 3, 00:20:04.994 "num_base_bdevs_operational": 4, 00:20:04.994 "base_bdevs_list": [ 00:20:04.994 { 00:20:04.994 "name": null, 00:20:04.994 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:04.994 "is_configured": false, 00:20:04.994 "data_offset": 2048, 00:20:04.994 "data_size": 63488 00:20:04.994 }, 00:20:04.994 { 00:20:04.994 "name": "BaseBdev2", 00:20:04.994 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:20:04.994 "is_configured": true, 00:20:04.994 "data_offset": 2048, 00:20:04.994 "data_size": 63488 00:20:04.994 }, 00:20:04.994 { 00:20:04.994 "name": "BaseBdev3", 00:20:04.994 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:20:04.994 "is_configured": true, 00:20:04.994 "data_offset": 2048, 00:20:04.994 "data_size": 63488 00:20:04.994 }, 00:20:04.994 { 00:20:04.994 "name": "BaseBdev4", 00:20:04.994 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:20:04.994 "is_configured": true, 00:20:04.994 "data_offset": 2048, 00:20:04.994 "data_size": 63488 00:20:04.994 } 00:20:04.994 ] 00:20:04.994 }' 00:20:04.994 10:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.994 10:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.561 10:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.561 10:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:05.823 10:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:05.823 10:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.823 10:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:06.113 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7127c52b-4792-48a7-85f0-7108f46099a4 00:20:06.372 [2024-07-12 10:46:41.353568] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:06.372 [2024-07-12 10:46:41.353740] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a8d850 00:20:06.372 [2024-07-12 10:46:41.353755] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:06.372 [2024-07-12 10:46:41.353935] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a83d80 00:20:06.372 [2024-07-12 10:46:41.354051] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a8d850 00:20:06.372 [2024-07-12 10:46:41.354061] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a8d850 00:20:06.372 [2024-07-12 10:46:41.354153] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:06.372 NewBaseBdev 00:20:06.372 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:06.372 10:46:41 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:06.372 10:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:06.372 10:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:06.372 10:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:06.372 10:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:06.372 10:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:06.631 10:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:06.891 [ 00:20:06.891 { 00:20:06.891 "name": "NewBaseBdev", 00:20:06.891 "aliases": [ 00:20:06.891 "7127c52b-4792-48a7-85f0-7108f46099a4" 00:20:06.891 ], 00:20:06.891 "product_name": "Malloc disk", 00:20:06.891 "block_size": 512, 00:20:06.891 "num_blocks": 65536, 00:20:06.891 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:06.891 "assigned_rate_limits": { 00:20:06.891 "rw_ios_per_sec": 0, 00:20:06.891 "rw_mbytes_per_sec": 0, 00:20:06.891 "r_mbytes_per_sec": 0, 00:20:06.891 "w_mbytes_per_sec": 0 00:20:06.891 }, 00:20:06.891 "claimed": true, 00:20:06.891 "claim_type": "exclusive_write", 00:20:06.891 "zoned": false, 00:20:06.891 "supported_io_types": { 00:20:06.891 "read": true, 00:20:06.891 "write": true, 00:20:06.891 "unmap": true, 00:20:06.891 "flush": true, 00:20:06.891 "reset": true, 00:20:06.891 "nvme_admin": false, 00:20:06.891 "nvme_io": false, 00:20:06.891 "nvme_io_md": false, 00:20:06.891 "write_zeroes": true, 00:20:06.891 "zcopy": true, 00:20:06.891 "get_zone_info": false, 00:20:06.891 "zone_management": false, 00:20:06.891 "zone_append": false, 00:20:06.891 "compare": false, 00:20:06.891 "compare_and_write": false, 00:20:06.891 "abort": true, 00:20:06.891 "seek_hole": false, 00:20:06.891 "seek_data": false, 00:20:06.891 "copy": true, 00:20:06.891 "nvme_iov_md": false 00:20:06.891 }, 00:20:06.891 "memory_domains": [ 00:20:06.891 { 00:20:06.891 "dma_device_id": "system", 00:20:06.891 "dma_device_type": 1 00:20:06.891 }, 00:20:06.891 { 00:20:06.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.891 "dma_device_type": 2 00:20:06.891 } 00:20:06.891 ], 00:20:06.891 "driver_specific": {} 00:20:06.891 } 00:20:06.891 ] 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.891 10:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.151 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.151 "name": "Existed_Raid", 00:20:07.151 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:20:07.151 "strip_size_kb": 64, 00:20:07.151 "state": "online", 00:20:07.151 "raid_level": "concat", 00:20:07.151 "superblock": true, 00:20:07.151 "num_base_bdevs": 4, 00:20:07.151 "num_base_bdevs_discovered": 4, 00:20:07.151 "num_base_bdevs_operational": 4, 00:20:07.151 "base_bdevs_list": [ 00:20:07.151 { 00:20:07.151 "name": "NewBaseBdev", 00:20:07.151 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:07.151 "is_configured": true, 00:20:07.151 "data_offset": 2048, 00:20:07.151 "data_size": 63488 00:20:07.151 }, 00:20:07.151 { 00:20:07.151 "name": "BaseBdev2", 00:20:07.151 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:20:07.151 "is_configured": true, 00:20:07.151 "data_offset": 2048, 00:20:07.151 "data_size": 63488 00:20:07.151 }, 00:20:07.151 { 00:20:07.151 "name": "BaseBdev3", 00:20:07.151 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:20:07.151 "is_configured": true, 00:20:07.151 "data_offset": 2048, 00:20:07.151 "data_size": 63488 00:20:07.151 }, 00:20:07.151 { 00:20:07.151 "name": "BaseBdev4", 00:20:07.151 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:20:07.151 "is_configured": true, 00:20:07.151 "data_offset": 2048, 00:20:07.151 "data_size": 63488 00:20:07.151 } 00:20:07.151 ] 00:20:07.151 }' 00:20:07.151 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.151 10:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:07.717 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:07.717 [2024-07-12 10:46:42.910006] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:07.981 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:20:07.981 "name": "Existed_Raid", 00:20:07.981 "aliases": [ 00:20:07.981 "03c5f5c1-f192-42c8-817c-3c9556fc8814" 00:20:07.981 ], 00:20:07.981 "product_name": "Raid Volume", 00:20:07.981 "block_size": 512, 00:20:07.981 "num_blocks": 253952, 00:20:07.981 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:20:07.981 "assigned_rate_limits": { 00:20:07.981 "rw_ios_per_sec": 0, 00:20:07.981 "rw_mbytes_per_sec": 0, 00:20:07.981 "r_mbytes_per_sec": 0, 00:20:07.981 "w_mbytes_per_sec": 0 00:20:07.981 }, 00:20:07.981 "claimed": false, 00:20:07.981 "zoned": false, 00:20:07.981 "supported_io_types": { 00:20:07.981 "read": true, 00:20:07.981 "write": true, 00:20:07.981 "unmap": true, 00:20:07.981 "flush": true, 00:20:07.981 "reset": true, 00:20:07.981 "nvme_admin": false, 00:20:07.981 "nvme_io": false, 00:20:07.981 "nvme_io_md": false, 00:20:07.981 "write_zeroes": true, 00:20:07.981 "zcopy": false, 00:20:07.981 "get_zone_info": false, 00:20:07.981 "zone_management": false, 00:20:07.981 "zone_append": false, 00:20:07.981 "compare": false, 00:20:07.981 "compare_and_write": false, 00:20:07.981 "abort": false, 00:20:07.981 "seek_hole": false, 00:20:07.981 "seek_data": false, 00:20:07.981 "copy": false, 00:20:07.981 "nvme_iov_md": false 00:20:07.981 }, 00:20:07.981 "memory_domains": [ 00:20:07.981 { 00:20:07.981 "dma_device_id": "system", 00:20:07.981 "dma_device_type": 1 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.981 "dma_device_type": 2 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "dma_device_id": "system", 00:20:07.981 "dma_device_type": 1 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.981 "dma_device_type": 2 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "dma_device_id": "system", 00:20:07.981 "dma_device_type": 1 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.981 "dma_device_type": 2 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "dma_device_id": "system", 00:20:07.981 "dma_device_type": 1 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.981 "dma_device_type": 2 00:20:07.981 } 00:20:07.981 ], 00:20:07.981 "driver_specific": { 00:20:07.981 "raid": { 00:20:07.981 "uuid": "03c5f5c1-f192-42c8-817c-3c9556fc8814", 00:20:07.981 "strip_size_kb": 64, 00:20:07.981 "state": "online", 00:20:07.981 "raid_level": "concat", 00:20:07.981 "superblock": true, 00:20:07.981 "num_base_bdevs": 4, 00:20:07.981 "num_base_bdevs_discovered": 4, 00:20:07.981 "num_base_bdevs_operational": 4, 00:20:07.981 "base_bdevs_list": [ 00:20:07.981 { 00:20:07.981 "name": "NewBaseBdev", 00:20:07.981 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:07.981 "is_configured": true, 00:20:07.981 "data_offset": 2048, 00:20:07.981 "data_size": 63488 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "name": "BaseBdev2", 00:20:07.981 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:20:07.981 "is_configured": true, 00:20:07.981 "data_offset": 2048, 00:20:07.981 "data_size": 63488 00:20:07.981 }, 00:20:07.981 { 00:20:07.981 "name": "BaseBdev3", 00:20:07.981 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:20:07.982 "is_configured": true, 00:20:07.982 "data_offset": 2048, 00:20:07.982 "data_size": 63488 00:20:07.982 }, 00:20:07.982 { 00:20:07.982 "name": "BaseBdev4", 00:20:07.982 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:20:07.982 "is_configured": true, 00:20:07.982 "data_offset": 2048, 00:20:07.982 "data_size": 63488 00:20:07.982 } 
00:20:07.982 ] 00:20:07.982 } 00:20:07.982 } 00:20:07.982 }' 00:20:07.982 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:07.982 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:07.982 BaseBdev2 00:20:07.982 BaseBdev3 00:20:07.982 BaseBdev4' 00:20:07.982 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:07.982 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:07.982 10:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:08.243 "name": "NewBaseBdev", 00:20:08.243 "aliases": [ 00:20:08.243 "7127c52b-4792-48a7-85f0-7108f46099a4" 00:20:08.243 ], 00:20:08.243 "product_name": "Malloc disk", 00:20:08.243 "block_size": 512, 00:20:08.243 "num_blocks": 65536, 00:20:08.243 "uuid": "7127c52b-4792-48a7-85f0-7108f46099a4", 00:20:08.243 "assigned_rate_limits": { 00:20:08.243 "rw_ios_per_sec": 0, 00:20:08.243 "rw_mbytes_per_sec": 0, 00:20:08.243 "r_mbytes_per_sec": 0, 00:20:08.243 "w_mbytes_per_sec": 0 00:20:08.243 }, 00:20:08.243 "claimed": true, 00:20:08.243 "claim_type": "exclusive_write", 00:20:08.243 "zoned": false, 00:20:08.243 "supported_io_types": { 00:20:08.243 "read": true, 00:20:08.243 "write": true, 00:20:08.243 "unmap": true, 00:20:08.243 "flush": true, 00:20:08.243 "reset": true, 00:20:08.243 "nvme_admin": false, 00:20:08.243 "nvme_io": false, 00:20:08.243 "nvme_io_md": false, 00:20:08.243 "write_zeroes": true, 00:20:08.243 "zcopy": true, 00:20:08.243 "get_zone_info": false, 00:20:08.243 "zone_management": false, 00:20:08.243 "zone_append": false, 00:20:08.243 "compare": false, 00:20:08.243 "compare_and_write": false, 00:20:08.243 "abort": true, 00:20:08.243 "seek_hole": false, 00:20:08.243 "seek_data": false, 00:20:08.243 "copy": true, 00:20:08.243 "nvme_iov_md": false 00:20:08.243 }, 00:20:08.243 "memory_domains": [ 00:20:08.243 { 00:20:08.243 "dma_device_id": "system", 00:20:08.243 "dma_device_type": 1 00:20:08.243 }, 00:20:08.243 { 00:20:08.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.243 "dma_device_type": 2 00:20:08.243 } 00:20:08.243 ], 00:20:08.243 "driver_specific": {} 00:20:08.243 }' 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:08.243 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:08.500 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:08.500 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:08.500 
10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:08.500 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:08.500 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:08.500 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:08.500 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:08.500 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:08.757 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:08.758 "name": "BaseBdev2", 00:20:08.758 "aliases": [ 00:20:08.758 "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14" 00:20:08.758 ], 00:20:08.758 "product_name": "Malloc disk", 00:20:08.758 "block_size": 512, 00:20:08.758 "num_blocks": 65536, 00:20:08.758 "uuid": "4d61b0fd-61b3-4dcb-a159-8dc8aa676e14", 00:20:08.758 "assigned_rate_limits": { 00:20:08.758 "rw_ios_per_sec": 0, 00:20:08.758 "rw_mbytes_per_sec": 0, 00:20:08.758 "r_mbytes_per_sec": 0, 00:20:08.758 "w_mbytes_per_sec": 0 00:20:08.758 }, 00:20:08.758 "claimed": true, 00:20:08.758 "claim_type": "exclusive_write", 00:20:08.758 "zoned": false, 00:20:08.758 "supported_io_types": { 00:20:08.758 "read": true, 00:20:08.758 "write": true, 00:20:08.758 "unmap": true, 00:20:08.758 "flush": true, 00:20:08.758 "reset": true, 00:20:08.758 "nvme_admin": false, 00:20:08.758 "nvme_io": false, 00:20:08.758 "nvme_io_md": false, 00:20:08.758 "write_zeroes": true, 00:20:08.758 "zcopy": true, 00:20:08.758 "get_zone_info": false, 00:20:08.758 "zone_management": false, 00:20:08.758 "zone_append": false, 00:20:08.758 "compare": false, 00:20:08.758 "compare_and_write": false, 00:20:08.758 "abort": true, 00:20:08.758 "seek_hole": false, 00:20:08.758 "seek_data": false, 00:20:08.758 "copy": true, 00:20:08.758 "nvme_iov_md": false 00:20:08.758 }, 00:20:08.758 "memory_domains": [ 00:20:08.758 { 00:20:08.758 "dma_device_id": "system", 00:20:08.758 "dma_device_type": 1 00:20:08.758 }, 00:20:08.758 { 00:20:08.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.758 "dma_device_type": 2 00:20:08.758 } 00:20:08.758 ], 00:20:08.758 "driver_specific": {} 00:20:08.758 }' 00:20:08.758 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:08.758 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:08.758 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:08.758 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.015 10:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.015 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:09.015 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.016 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.016 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:09.016 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.016 10:46:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.016 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:09.016 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:09.016 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:09.016 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:09.274 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:09.274 "name": "BaseBdev3", 00:20:09.274 "aliases": [ 00:20:09.274 "4228a065-55d7-4bb8-8271-81bb733a948e" 00:20:09.274 ], 00:20:09.274 "product_name": "Malloc disk", 00:20:09.274 "block_size": 512, 00:20:09.274 "num_blocks": 65536, 00:20:09.274 "uuid": "4228a065-55d7-4bb8-8271-81bb733a948e", 00:20:09.274 "assigned_rate_limits": { 00:20:09.274 "rw_ios_per_sec": 0, 00:20:09.274 "rw_mbytes_per_sec": 0, 00:20:09.274 "r_mbytes_per_sec": 0, 00:20:09.274 "w_mbytes_per_sec": 0 00:20:09.274 }, 00:20:09.274 "claimed": true, 00:20:09.274 "claim_type": "exclusive_write", 00:20:09.274 "zoned": false, 00:20:09.274 "supported_io_types": { 00:20:09.274 "read": true, 00:20:09.274 "write": true, 00:20:09.274 "unmap": true, 00:20:09.274 "flush": true, 00:20:09.274 "reset": true, 00:20:09.274 "nvme_admin": false, 00:20:09.274 "nvme_io": false, 00:20:09.274 "nvme_io_md": false, 00:20:09.274 "write_zeroes": true, 00:20:09.274 "zcopy": true, 00:20:09.274 "get_zone_info": false, 00:20:09.274 "zone_management": false, 00:20:09.274 "zone_append": false, 00:20:09.274 "compare": false, 00:20:09.274 "compare_and_write": false, 00:20:09.274 "abort": true, 00:20:09.274 "seek_hole": false, 00:20:09.274 "seek_data": false, 00:20:09.274 "copy": true, 00:20:09.274 "nvme_iov_md": false 00:20:09.274 }, 00:20:09.274 "memory_domains": [ 00:20:09.274 { 00:20:09.274 "dma_device_id": "system", 00:20:09.274 "dma_device_type": 1 00:20:09.274 }, 00:20:09.274 { 00:20:09.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.274 "dma_device_type": 2 00:20:09.274 } 00:20:09.274 ], 00:20:09.274 "driver_specific": {} 00:20:09.274 }' 00:20:09.274 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.532 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.790 10:46:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:09.790 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:09.790 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:09.790 10:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:10.048 "name": "BaseBdev4", 00:20:10.048 "aliases": [ 00:20:10.048 "06335a06-f8f1-466f-9fbd-db19f9d12c05" 00:20:10.048 ], 00:20:10.048 "product_name": "Malloc disk", 00:20:10.048 "block_size": 512, 00:20:10.048 "num_blocks": 65536, 00:20:10.048 "uuid": "06335a06-f8f1-466f-9fbd-db19f9d12c05", 00:20:10.048 "assigned_rate_limits": { 00:20:10.048 "rw_ios_per_sec": 0, 00:20:10.048 "rw_mbytes_per_sec": 0, 00:20:10.048 "r_mbytes_per_sec": 0, 00:20:10.048 "w_mbytes_per_sec": 0 00:20:10.048 }, 00:20:10.048 "claimed": true, 00:20:10.048 "claim_type": "exclusive_write", 00:20:10.048 "zoned": false, 00:20:10.048 "supported_io_types": { 00:20:10.048 "read": true, 00:20:10.048 "write": true, 00:20:10.048 "unmap": true, 00:20:10.048 "flush": true, 00:20:10.048 "reset": true, 00:20:10.048 "nvme_admin": false, 00:20:10.048 "nvme_io": false, 00:20:10.048 "nvme_io_md": false, 00:20:10.048 "write_zeroes": true, 00:20:10.048 "zcopy": true, 00:20:10.048 "get_zone_info": false, 00:20:10.048 "zone_management": false, 00:20:10.048 "zone_append": false, 00:20:10.048 "compare": false, 00:20:10.048 "compare_and_write": false, 00:20:10.048 "abort": true, 00:20:10.048 "seek_hole": false, 00:20:10.048 "seek_data": false, 00:20:10.048 "copy": true, 00:20:10.048 "nvme_iov_md": false 00:20:10.048 }, 00:20:10.048 "memory_domains": [ 00:20:10.048 { 00:20:10.048 "dma_device_id": "system", 00:20:10.048 "dma_device_type": 1 00:20:10.048 }, 00:20:10.048 { 00:20:10.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.048 "dma_device_type": 2 00:20:10.048 } 00:20:10.048 ], 00:20:10.048 "driver_specific": {} 00:20:10.048 }' 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.048 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.307 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:10.307 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.307 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.307 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:10.307 10:46:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:10.565 [2024-07-12 10:46:45.568760] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:10.565 [2024-07-12 10:46:45.568794] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:10.565 [2024-07-12 10:46:45.568856] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:10.565 [2024-07-12 10:46:45.568922] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:10.565 [2024-07-12 10:46:45.568935] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a8d850 name Existed_Raid, state offline 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2095367 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2095367 ']' 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2095367 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2095367 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2095367' 00:20:10.565 killing process with pid 2095367 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2095367 00:20:10.565 [2024-07-12 10:46:45.633457] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:10.565 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2095367 00:20:10.565 [2024-07-12 10:46:45.675773] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:10.825 10:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:10.825 00:20:10.825 real 0m32.089s 00:20:10.825 user 0m58.935s 00:20:10.825 sys 0m5.699s 00:20:10.825 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:10.825 10:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:10.825 ************************************ 00:20:10.825 END TEST raid_state_function_test_sb 00:20:10.825 ************************************ 00:20:10.825 10:46:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:10.825 10:46:45 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:20:10.825 10:46:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:10.825 10:46:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:10.825 10:46:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:10.825 ************************************ 00:20:10.825 START TEST raid_superblock_test 00:20:10.825 
************************************ 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2100248 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2100248 /var/tmp/spdk-raid.sock 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2100248 ']' 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:10.825 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:10.825 10:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.084 [2024-07-12 10:46:46.042590] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:20:11.084 [2024-07-12 10:46:46.042655] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2100248 ] 00:20:11.084 [2024-07-12 10:46:46.161850] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:11.084 [2024-07-12 10:46:46.264061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:11.343 [2024-07-12 10:46:46.328698] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:11.343 [2024-07-12 10:46:46.328752] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:12.280 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:12.539 malloc1 00:20:12.798 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:12.798 [2024-07-12 10:46:47.974818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:12.798 [2024-07-12 10:46:47.974869] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:12.798 [2024-07-12 10:46:47.974892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fa570 00:20:12.798 [2024-07-12 10:46:47.974906] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:12.798 [2024-07-12 10:46:47.976625] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:12.798 [2024-07-12 10:46:47.976657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:12.798 pt1 00:20:12.798 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:12.798 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:13.057 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:13.057 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:13.057 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:13.057 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:13.057 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:13.057 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:13.057 10:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:13.316 malloc2 00:20:13.316 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:13.576 [2024-07-12 10:46:48.733612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:13.576 [2024-07-12 10:46:48.733662] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:13.576 [2024-07-12 10:46:48.733682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fb970 00:20:13.576 [2024-07-12 10:46:48.733695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:13.576 [2024-07-12 10:46:48.735339] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:13.576 [2024-07-12 10:46:48.735369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:13.576 pt2 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:13.576 10:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:14.144 malloc3 00:20:14.144 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:14.403 [2024-07-12 10:46:49.493476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:14.403 [2024-07-12 10:46:49.493533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:14.403 [2024-07-12 10:46:49.493551] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2292340 00:20:14.403 [2024-07-12 10:46:49.493564] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:14.403 [2024-07-12 10:46:49.495141] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:14.403 [2024-07-12 10:46:49.495172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:14.403 pt3 00:20:14.403 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:14.403 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:14.403 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:14.403 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:14.404 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:14.404 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:14.404 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:14.404 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:14.404 10:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:14.972 malloc4 00:20:14.972 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:15.231 [2024-07-12 10:46:50.195890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:15.231 [2024-07-12 10:46:50.195940] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.231 [2024-07-12 10:46:50.195963] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2294c60 00:20:15.231 [2024-07-12 10:46:50.195976] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.231 [2024-07-12 10:46:50.197465] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.231 [2024-07-12 10:46:50.197501] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:15.231 pt4 00:20:15.231 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:15.231 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:15.231 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:15.490 [2024-07-12 10:46:50.448585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:15.490 [2024-07-12 10:46:50.449937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:15.490 [2024-07-12 10:46:50.449992] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:15.490 [2024-07-12 10:46:50.450036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:15.490 [2024-07-12 10:46:50.450206] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f2530 00:20:15.490 [2024-07-12 10:46:50.450218] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:15.490 [2024-07-12 10:46:50.450415] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f0770 00:20:15.490 [2024-07-12 10:46:50.450574] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f2530 00:20:15.490 [2024-07-12 10:46:50.450584] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f2530 00:20:15.490 [2024-07-12 10:46:50.450682] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.490 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.749 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.749 "name": "raid_bdev1", 00:20:15.749 "uuid": "53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:15.749 "strip_size_kb": 64, 00:20:15.749 "state": "online", 00:20:15.749 "raid_level": "concat", 00:20:15.749 "superblock": true, 00:20:15.749 "num_base_bdevs": 4, 00:20:15.749 "num_base_bdevs_discovered": 4, 00:20:15.749 "num_base_bdevs_operational": 4, 00:20:15.749 "base_bdevs_list": [ 00:20:15.749 { 00:20:15.749 "name": "pt1", 00:20:15.749 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:15.749 "is_configured": true, 00:20:15.749 "data_offset": 2048, 00:20:15.749 "data_size": 63488 00:20:15.749 }, 00:20:15.749 { 00:20:15.749 "name": "pt2", 00:20:15.749 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:15.749 "is_configured": true, 00:20:15.749 "data_offset": 2048, 00:20:15.749 "data_size": 63488 00:20:15.749 }, 00:20:15.749 { 00:20:15.749 "name": "pt3", 00:20:15.749 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:15.749 "is_configured": true, 00:20:15.749 "data_offset": 2048, 00:20:15.749 "data_size": 63488 00:20:15.749 }, 00:20:15.749 { 00:20:15.749 "name": "pt4", 00:20:15.749 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:15.749 "is_configured": true, 00:20:15.749 "data_offset": 2048, 00:20:15.749 "data_size": 63488 00:20:15.749 } 00:20:15.749 ] 00:20:15.749 }' 00:20:15.749 10:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.749 10:46:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:16.316 [2024-07-12 10:46:51.455507] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:16.316 "name": "raid_bdev1", 00:20:16.316 "aliases": [ 00:20:16.316 "53616ce3-9dd3-4018-9869-e60466ff237b" 00:20:16.316 ], 00:20:16.316 "product_name": "Raid Volume", 00:20:16.316 "block_size": 512, 00:20:16.316 "num_blocks": 253952, 00:20:16.316 "uuid": "53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:16.316 "assigned_rate_limits": { 00:20:16.316 "rw_ios_per_sec": 0, 00:20:16.316 "rw_mbytes_per_sec": 0, 00:20:16.316 "r_mbytes_per_sec": 0, 00:20:16.316 "w_mbytes_per_sec": 0 00:20:16.316 }, 00:20:16.316 "claimed": false, 00:20:16.316 "zoned": false, 00:20:16.316 "supported_io_types": { 00:20:16.316 "read": true, 00:20:16.316 "write": true, 00:20:16.316 "unmap": true, 00:20:16.316 "flush": true, 00:20:16.316 "reset": true, 00:20:16.316 "nvme_admin": false, 00:20:16.316 "nvme_io": false, 00:20:16.316 "nvme_io_md": false, 00:20:16.316 "write_zeroes": true, 00:20:16.316 "zcopy": false, 00:20:16.316 "get_zone_info": false, 00:20:16.316 "zone_management": false, 00:20:16.316 "zone_append": false, 00:20:16.316 "compare": false, 00:20:16.316 "compare_and_write": false, 00:20:16.316 "abort": false, 00:20:16.316 "seek_hole": false, 00:20:16.316 "seek_data": false, 00:20:16.316 "copy": false, 00:20:16.316 "nvme_iov_md": false 00:20:16.316 }, 00:20:16.316 "memory_domains": [ 00:20:16.316 { 00:20:16.316 "dma_device_id": "system", 00:20:16.316 "dma_device_type": 1 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.316 "dma_device_type": 2 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "dma_device_id": "system", 00:20:16.316 "dma_device_type": 1 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.316 "dma_device_type": 2 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "dma_device_id": "system", 00:20:16.316 "dma_device_type": 1 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.316 "dma_device_type": 2 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "dma_device_id": "system", 00:20:16.316 "dma_device_type": 1 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.316 "dma_device_type": 2 00:20:16.316 } 00:20:16.316 ], 00:20:16.316 "driver_specific": { 00:20:16.316 "raid": { 00:20:16.316 "uuid": "53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:16.316 "strip_size_kb": 64, 00:20:16.316 "state": "online", 00:20:16.316 "raid_level": "concat", 00:20:16.316 "superblock": 
true, 00:20:16.316 "num_base_bdevs": 4, 00:20:16.316 "num_base_bdevs_discovered": 4, 00:20:16.316 "num_base_bdevs_operational": 4, 00:20:16.316 "base_bdevs_list": [ 00:20:16.316 { 00:20:16.316 "name": "pt1", 00:20:16.316 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:16.316 "is_configured": true, 00:20:16.316 "data_offset": 2048, 00:20:16.316 "data_size": 63488 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "name": "pt2", 00:20:16.316 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:16.316 "is_configured": true, 00:20:16.316 "data_offset": 2048, 00:20:16.316 "data_size": 63488 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "name": "pt3", 00:20:16.316 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:16.316 "is_configured": true, 00:20:16.316 "data_offset": 2048, 00:20:16.316 "data_size": 63488 00:20:16.316 }, 00:20:16.316 { 00:20:16.316 "name": "pt4", 00:20:16.316 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:16.316 "is_configured": true, 00:20:16.316 "data_offset": 2048, 00:20:16.316 "data_size": 63488 00:20:16.316 } 00:20:16.316 ] 00:20:16.316 } 00:20:16.316 } 00:20:16.316 }' 00:20:16.316 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:16.575 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:16.575 pt2 00:20:16.575 pt3 00:20:16.575 pt4' 00:20:16.575 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:16.575 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:16.575 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:16.834 "name": "pt1", 00:20:16.834 "aliases": [ 00:20:16.834 "00000000-0000-0000-0000-000000000001" 00:20:16.834 ], 00:20:16.834 "product_name": "passthru", 00:20:16.834 "block_size": 512, 00:20:16.834 "num_blocks": 65536, 00:20:16.834 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:16.834 "assigned_rate_limits": { 00:20:16.834 "rw_ios_per_sec": 0, 00:20:16.834 "rw_mbytes_per_sec": 0, 00:20:16.834 "r_mbytes_per_sec": 0, 00:20:16.834 "w_mbytes_per_sec": 0 00:20:16.834 }, 00:20:16.834 "claimed": true, 00:20:16.834 "claim_type": "exclusive_write", 00:20:16.834 "zoned": false, 00:20:16.834 "supported_io_types": { 00:20:16.834 "read": true, 00:20:16.834 "write": true, 00:20:16.834 "unmap": true, 00:20:16.834 "flush": true, 00:20:16.834 "reset": true, 00:20:16.834 "nvme_admin": false, 00:20:16.834 "nvme_io": false, 00:20:16.834 "nvme_io_md": false, 00:20:16.834 "write_zeroes": true, 00:20:16.834 "zcopy": true, 00:20:16.834 "get_zone_info": false, 00:20:16.834 "zone_management": false, 00:20:16.834 "zone_append": false, 00:20:16.834 "compare": false, 00:20:16.834 "compare_and_write": false, 00:20:16.834 "abort": true, 00:20:16.834 "seek_hole": false, 00:20:16.834 "seek_data": false, 00:20:16.834 "copy": true, 00:20:16.834 "nvme_iov_md": false 00:20:16.834 }, 00:20:16.834 "memory_domains": [ 00:20:16.834 { 00:20:16.834 "dma_device_id": "system", 00:20:16.834 "dma_device_type": 1 00:20:16.834 }, 00:20:16.834 { 00:20:16.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.834 "dma_device_type": 2 00:20:16.834 } 00:20:16.834 ], 00:20:16.834 "driver_specific": { 00:20:16.834 "passthru": 
{ 00:20:16.834 "name": "pt1", 00:20:16.834 "base_bdev_name": "malloc1" 00:20:16.834 } 00:20:16.834 } 00:20:16.834 }' 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.834 10:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.834 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.093 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:17.093 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.093 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.093 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:17.093 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:17.093 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:17.093 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.353 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.353 "name": "pt2", 00:20:17.353 "aliases": [ 00:20:17.353 "00000000-0000-0000-0000-000000000002" 00:20:17.353 ], 00:20:17.353 "product_name": "passthru", 00:20:17.353 "block_size": 512, 00:20:17.353 "num_blocks": 65536, 00:20:17.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:17.353 "assigned_rate_limits": { 00:20:17.353 "rw_ios_per_sec": 0, 00:20:17.353 "rw_mbytes_per_sec": 0, 00:20:17.353 "r_mbytes_per_sec": 0, 00:20:17.353 "w_mbytes_per_sec": 0 00:20:17.353 }, 00:20:17.353 "claimed": true, 00:20:17.353 "claim_type": "exclusive_write", 00:20:17.353 "zoned": false, 00:20:17.353 "supported_io_types": { 00:20:17.353 "read": true, 00:20:17.353 "write": true, 00:20:17.353 "unmap": true, 00:20:17.353 "flush": true, 00:20:17.353 "reset": true, 00:20:17.353 "nvme_admin": false, 00:20:17.353 "nvme_io": false, 00:20:17.353 "nvme_io_md": false, 00:20:17.353 "write_zeroes": true, 00:20:17.353 "zcopy": true, 00:20:17.353 "get_zone_info": false, 00:20:17.353 "zone_management": false, 00:20:17.353 "zone_append": false, 00:20:17.353 "compare": false, 00:20:17.353 "compare_and_write": false, 00:20:17.353 "abort": true, 00:20:17.353 "seek_hole": false, 00:20:17.353 "seek_data": false, 00:20:17.353 "copy": true, 00:20:17.353 "nvme_iov_md": false 00:20:17.353 }, 00:20:17.353 "memory_domains": [ 00:20:17.353 { 00:20:17.353 "dma_device_id": "system", 00:20:17.353 "dma_device_type": 1 00:20:17.353 }, 00:20:17.353 { 00:20:17.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.353 "dma_device_type": 2 00:20:17.353 } 00:20:17.353 ], 00:20:17.353 "driver_specific": { 00:20:17.353 "passthru": { 00:20:17.353 "name": "pt2", 00:20:17.353 "base_bdev_name": "malloc2" 00:20:17.353 } 00:20:17.353 } 00:20:17.353 }' 00:20:17.353 
10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.353 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.353 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:17.353 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.353 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.612 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:17.872 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.872 "name": "pt3", 00:20:17.872 "aliases": [ 00:20:17.872 "00000000-0000-0000-0000-000000000003" 00:20:17.872 ], 00:20:17.872 "product_name": "passthru", 00:20:17.872 "block_size": 512, 00:20:17.872 "num_blocks": 65536, 00:20:17.872 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:17.872 "assigned_rate_limits": { 00:20:17.872 "rw_ios_per_sec": 0, 00:20:17.872 "rw_mbytes_per_sec": 0, 00:20:17.872 "r_mbytes_per_sec": 0, 00:20:17.872 "w_mbytes_per_sec": 0 00:20:17.872 }, 00:20:17.872 "claimed": true, 00:20:17.872 "claim_type": "exclusive_write", 00:20:17.872 "zoned": false, 00:20:17.872 "supported_io_types": { 00:20:17.872 "read": true, 00:20:17.872 "write": true, 00:20:17.872 "unmap": true, 00:20:17.872 "flush": true, 00:20:17.872 "reset": true, 00:20:17.872 "nvme_admin": false, 00:20:17.872 "nvme_io": false, 00:20:17.872 "nvme_io_md": false, 00:20:17.872 "write_zeroes": true, 00:20:17.872 "zcopy": true, 00:20:17.872 "get_zone_info": false, 00:20:17.872 "zone_management": false, 00:20:17.872 "zone_append": false, 00:20:17.872 "compare": false, 00:20:17.872 "compare_and_write": false, 00:20:17.872 "abort": true, 00:20:17.872 "seek_hole": false, 00:20:17.872 "seek_data": false, 00:20:17.872 "copy": true, 00:20:17.872 "nvme_iov_md": false 00:20:17.872 }, 00:20:17.872 "memory_domains": [ 00:20:17.872 { 00:20:17.872 "dma_device_id": "system", 00:20:17.872 "dma_device_type": 1 00:20:17.872 }, 00:20:17.872 { 00:20:17.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.872 "dma_device_type": 2 00:20:17.872 } 00:20:17.872 ], 00:20:17.872 "driver_specific": { 00:20:17.872 "passthru": { 00:20:17.872 "name": "pt3", 00:20:17.872 "base_bdev_name": "malloc3" 00:20:17.872 } 00:20:17.872 } 00:20:17.872 }' 00:20:17.872 10:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.872 10:46:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.130 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.387 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:18.387 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:18.387 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:18.387 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:18.387 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:18.387 "name": "pt4", 00:20:18.387 "aliases": [ 00:20:18.387 "00000000-0000-0000-0000-000000000004" 00:20:18.387 ], 00:20:18.387 "product_name": "passthru", 00:20:18.387 "block_size": 512, 00:20:18.387 "num_blocks": 65536, 00:20:18.387 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:18.387 "assigned_rate_limits": { 00:20:18.387 "rw_ios_per_sec": 0, 00:20:18.387 "rw_mbytes_per_sec": 0, 00:20:18.387 "r_mbytes_per_sec": 0, 00:20:18.387 "w_mbytes_per_sec": 0 00:20:18.387 }, 00:20:18.387 "claimed": true, 00:20:18.387 "claim_type": "exclusive_write", 00:20:18.387 "zoned": false, 00:20:18.387 "supported_io_types": { 00:20:18.387 "read": true, 00:20:18.387 "write": true, 00:20:18.387 "unmap": true, 00:20:18.387 "flush": true, 00:20:18.387 "reset": true, 00:20:18.387 "nvme_admin": false, 00:20:18.387 "nvme_io": false, 00:20:18.387 "nvme_io_md": false, 00:20:18.387 "write_zeroes": true, 00:20:18.387 "zcopy": true, 00:20:18.387 "get_zone_info": false, 00:20:18.387 "zone_management": false, 00:20:18.387 "zone_append": false, 00:20:18.387 "compare": false, 00:20:18.387 "compare_and_write": false, 00:20:18.387 "abort": true, 00:20:18.387 "seek_hole": false, 00:20:18.387 "seek_data": false, 00:20:18.387 "copy": true, 00:20:18.387 "nvme_iov_md": false 00:20:18.387 }, 00:20:18.387 "memory_domains": [ 00:20:18.387 { 00:20:18.387 "dma_device_id": "system", 00:20:18.387 "dma_device_type": 1 00:20:18.387 }, 00:20:18.387 { 00:20:18.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:18.387 "dma_device_type": 2 00:20:18.387 } 00:20:18.387 ], 00:20:18.387 "driver_specific": { 00:20:18.387 "passthru": { 00:20:18.387 "name": "pt4", 00:20:18.387 "base_bdev_name": "malloc4" 00:20:18.387 } 00:20:18.387 } 00:20:18.387 }' 00:20:18.387 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.387 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.644 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.903 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:18.903 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:18.903 10:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:18.903 [2024-07-12 10:46:54.030298] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:18.903 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=53616ce3-9dd3-4018-9869-e60466ff237b 00:20:18.903 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 53616ce3-9dd3-4018-9869-e60466ff237b ']' 00:20:18.903 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:19.160 [2024-07-12 10:46:54.278674] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:19.160 [2024-07-12 10:46:54.278698] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:19.160 [2024-07-12 10:46:54.278748] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:19.160 [2024-07-12 10:46:54.278812] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:19.160 [2024-07-12 10:46:54.278824] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f2530 name raid_bdev1, state offline 00:20:19.161 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.161 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:19.419 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:19.419 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:19.419 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:19.419 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:19.678 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:19.678 10:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:19.949 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:19.949 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:20.222 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:20.222 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:20.480 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:20.480 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:20.739 10:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:20.997 [2024-07-12 10:46:55.987197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:20.997 [2024-07-12 10:46:55.988526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:20.997 [2024-07-12 10:46:55.988569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
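For reference, the RPC sequence this raid_superblock_test run exercises can be condensed into the shell sketch below. It is distilled from the rpc.py invocations visible in the surrounding log (rpc.py standing in for spdk/scripts/rpc.py, pointed at the same /var/tmp/spdk-raid.sock socket); the loop is an illustration of the pattern, not a verbatim excerpt of bdev/bdev_raid.sh.

    # four 32 MB malloc bdevs with 512-byte blocks, each wrapped in a passthru bdev
    for i in 1 2 3 4; do
        rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc$i
        rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc$i -p pt$i \
            -u 00000000-0000-0000-0000-00000000000$i
    done
    # concat RAID bdev over the passthru bdevs, 64 KB strip size, with on-disk superblock (-s)
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat \
        -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
    # the array should report state "online" with all four base bdevs discovered
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | \
        jq -r '.[] | select(.name == "raid_bdev1")'

Deleting raid_bdev1 and the passthru bdevs leaves the RAID superblock behind on the malloc bdevs, which is why the bdev_raid_create attempt issued directly against 'malloc1 malloc2 malloc3 malloc4' just below is expected to fail with JSON-RPC error -17 (File exists).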
00:20:20.997 [2024-07-12 10:46:55.988603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:20.997 [2024-07-12 10:46:55.988651] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:20.997 [2024-07-12 10:46:55.988689] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:20.997 [2024-07-12 10:46:55.988712] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:20.997 [2024-07-12 10:46:55.988734] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:20.997 [2024-07-12 10:46:55.988752] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:20.997 [2024-07-12 10:46:55.988762] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x229dff0 name raid_bdev1, state configuring 00:20:20.997 request: 00:20:20.997 { 00:20:20.997 "name": "raid_bdev1", 00:20:20.997 "raid_level": "concat", 00:20:20.997 "base_bdevs": [ 00:20:20.997 "malloc1", 00:20:20.997 "malloc2", 00:20:20.997 "malloc3", 00:20:20.997 "malloc4" 00:20:20.997 ], 00:20:20.997 "strip_size_kb": 64, 00:20:20.997 "superblock": false, 00:20:20.997 "method": "bdev_raid_create", 00:20:20.997 "req_id": 1 00:20:20.997 } 00:20:20.997 Got JSON-RPC error response 00:20:20.997 response: 00:20:20.997 { 00:20:20.997 "code": -17, 00:20:20.997 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:20.997 } 00:20:20.997 10:46:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:20.997 10:46:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:20.997 10:46:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:20.997 10:46:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:20.997 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.997 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:21.256 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:21.256 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:21.256 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:21.514 [2024-07-12 10:46:56.464396] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:21.514 [2024-07-12 10:46:56.464447] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:21.514 [2024-07-12 10:46:56.464472] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fa7a0 00:20:21.514 [2024-07-12 10:46:56.464496] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.514 [2024-07-12 10:46:56.466146] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.514 [2024-07-12 10:46:56.466176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:21.514 [2024-07-12 
10:46:56.466246] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:21.514 [2024-07-12 10:46:56.466272] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:21.514 pt1 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.514 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.772 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.772 "name": "raid_bdev1", 00:20:21.772 "uuid": "53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:21.772 "strip_size_kb": 64, 00:20:21.772 "state": "configuring", 00:20:21.772 "raid_level": "concat", 00:20:21.772 "superblock": true, 00:20:21.772 "num_base_bdevs": 4, 00:20:21.772 "num_base_bdevs_discovered": 1, 00:20:21.772 "num_base_bdevs_operational": 4, 00:20:21.772 "base_bdevs_list": [ 00:20:21.772 { 00:20:21.772 "name": "pt1", 00:20:21.772 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:21.772 "is_configured": true, 00:20:21.772 "data_offset": 2048, 00:20:21.772 "data_size": 63488 00:20:21.772 }, 00:20:21.772 { 00:20:21.772 "name": null, 00:20:21.772 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:21.772 "is_configured": false, 00:20:21.772 "data_offset": 2048, 00:20:21.772 "data_size": 63488 00:20:21.772 }, 00:20:21.772 { 00:20:21.772 "name": null, 00:20:21.772 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:21.772 "is_configured": false, 00:20:21.772 "data_offset": 2048, 00:20:21.772 "data_size": 63488 00:20:21.772 }, 00:20:21.772 { 00:20:21.772 "name": null, 00:20:21.772 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:21.772 "is_configured": false, 00:20:21.772 "data_offset": 2048, 00:20:21.772 "data_size": 63488 00:20:21.772 } 00:20:21.772 ] 00:20:21.772 }' 00:20:21.772 10:46:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.772 10:46:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.337 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:22.337 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:22.594 [2024-07-12 10:46:57.563327] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:22.594 [2024-07-12 10:46:57.563381] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:22.594 [2024-07-12 10:46:57.563401] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f1ea0 00:20:22.594 [2024-07-12 10:46:57.563413] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:22.594 [2024-07-12 10:46:57.563764] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:22.594 [2024-07-12 10:46:57.563787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:22.594 [2024-07-12 10:46:57.563853] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:22.594 [2024-07-12 10:46:57.563874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:22.594 pt2 00:20:22.594 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:22.851 [2024-07-12 10:46:57.811994] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.851 10:46:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:23.108 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:23.108 "name": "raid_bdev1", 00:20:23.108 "uuid": "53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:23.108 "strip_size_kb": 64, 00:20:23.108 "state": "configuring", 00:20:23.108 "raid_level": "concat", 00:20:23.108 "superblock": true, 00:20:23.108 "num_base_bdevs": 4, 00:20:23.108 "num_base_bdevs_discovered": 1, 00:20:23.108 "num_base_bdevs_operational": 4, 00:20:23.108 "base_bdevs_list": [ 00:20:23.108 { 00:20:23.108 "name": "pt1", 00:20:23.108 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:23.108 "is_configured": true, 00:20:23.108 "data_offset": 2048, 00:20:23.108 "data_size": 63488 00:20:23.108 }, 00:20:23.108 
{ 00:20:23.108 "name": null, 00:20:23.108 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:23.108 "is_configured": false, 00:20:23.108 "data_offset": 2048, 00:20:23.108 "data_size": 63488 00:20:23.108 }, 00:20:23.108 { 00:20:23.108 "name": null, 00:20:23.108 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:23.108 "is_configured": false, 00:20:23.108 "data_offset": 2048, 00:20:23.108 "data_size": 63488 00:20:23.108 }, 00:20:23.108 { 00:20:23.108 "name": null, 00:20:23.108 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:23.108 "is_configured": false, 00:20:23.108 "data_offset": 2048, 00:20:23.108 "data_size": 63488 00:20:23.108 } 00:20:23.108 ] 00:20:23.108 }' 00:20:23.108 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:23.108 10:46:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.672 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:23.672 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:23.672 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:23.929 [2024-07-12 10:46:58.894943] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:23.929 [2024-07-12 10:46:58.894995] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:23.929 [2024-07-12 10:46:58.895015] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f0ec0 00:20:23.929 [2024-07-12 10:46:58.895029] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:23.929 [2024-07-12 10:46:58.895359] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:23.929 [2024-07-12 10:46:58.895380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:23.929 [2024-07-12 10:46:58.895440] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:23.929 [2024-07-12 10:46:58.895458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:23.929 pt2 00:20:23.929 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:23.929 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:23.929 10:46:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:24.185 [2024-07-12 10:46:59.143616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:24.185 [2024-07-12 10:46:59.143658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:24.185 [2024-07-12 10:46:59.143677] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f10f0 00:20:24.185 [2024-07-12 10:46:59.143690] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:24.185 [2024-07-12 10:46:59.144011] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:24.185 [2024-07-12 10:46:59.144029] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:24.185 [2024-07-12 10:46:59.144088] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:24.185 [2024-07-12 10:46:59.144106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:24.186 pt3 00:20:24.186 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:24.186 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:24.186 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:24.442 [2024-07-12 10:46:59.384244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:24.442 [2024-07-12 10:46:59.384282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:24.442 [2024-07-12 10:46:59.384300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f9af0 00:20:24.442 [2024-07-12 10:46:59.384312] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:24.442 [2024-07-12 10:46:59.384621] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:24.442 [2024-07-12 10:46:59.384639] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:24.442 [2024-07-12 10:46:59.384694] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:24.442 [2024-07-12 10:46:59.384713] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:24.442 [2024-07-12 10:46:59.384832] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f38f0 00:20:24.442 [2024-07-12 10:46:59.384843] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:24.442 [2024-07-12 10:46:59.385009] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f3150 00:20:24.442 [2024-07-12 10:46:59.385135] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f38f0 00:20:24.442 [2024-07-12 10:46:59.385145] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f38f0 00:20:24.442 [2024-07-12 10:46:59.385241] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:24.442 pt4 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.442 10:46:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.442 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.700 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.700 "name": "raid_bdev1", 00:20:24.700 "uuid": "53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:24.700 "strip_size_kb": 64, 00:20:24.700 "state": "online", 00:20:24.700 "raid_level": "concat", 00:20:24.700 "superblock": true, 00:20:24.700 "num_base_bdevs": 4, 00:20:24.700 "num_base_bdevs_discovered": 4, 00:20:24.700 "num_base_bdevs_operational": 4, 00:20:24.700 "base_bdevs_list": [ 00:20:24.700 { 00:20:24.700 "name": "pt1", 00:20:24.700 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:24.700 "is_configured": true, 00:20:24.700 "data_offset": 2048, 00:20:24.700 "data_size": 63488 00:20:24.700 }, 00:20:24.700 { 00:20:24.700 "name": "pt2", 00:20:24.700 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:24.700 "is_configured": true, 00:20:24.700 "data_offset": 2048, 00:20:24.700 "data_size": 63488 00:20:24.700 }, 00:20:24.700 { 00:20:24.700 "name": "pt3", 00:20:24.700 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:24.700 "is_configured": true, 00:20:24.700 "data_offset": 2048, 00:20:24.700 "data_size": 63488 00:20:24.700 }, 00:20:24.700 { 00:20:24.700 "name": "pt4", 00:20:24.700 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:24.700 "is_configured": true, 00:20:24.700 "data_offset": 2048, 00:20:24.700 "data_size": 63488 00:20:24.700 } 00:20:24.700 ] 00:20:24.700 }' 00:20:24.700 10:46:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.700 10:46:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:25.268 [2024-07-12 10:47:00.423324] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:25.268 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:25.268 "name": "raid_bdev1", 00:20:25.268 "aliases": [ 00:20:25.268 "53616ce3-9dd3-4018-9869-e60466ff237b" 00:20:25.268 ], 00:20:25.268 "product_name": "Raid Volume", 00:20:25.268 "block_size": 512, 00:20:25.268 "num_blocks": 253952, 00:20:25.268 "uuid": 
"53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:25.268 "assigned_rate_limits": { 00:20:25.268 "rw_ios_per_sec": 0, 00:20:25.268 "rw_mbytes_per_sec": 0, 00:20:25.268 "r_mbytes_per_sec": 0, 00:20:25.268 "w_mbytes_per_sec": 0 00:20:25.268 }, 00:20:25.268 "claimed": false, 00:20:25.268 "zoned": false, 00:20:25.268 "supported_io_types": { 00:20:25.268 "read": true, 00:20:25.268 "write": true, 00:20:25.268 "unmap": true, 00:20:25.268 "flush": true, 00:20:25.268 "reset": true, 00:20:25.268 "nvme_admin": false, 00:20:25.268 "nvme_io": false, 00:20:25.268 "nvme_io_md": false, 00:20:25.268 "write_zeroes": true, 00:20:25.268 "zcopy": false, 00:20:25.268 "get_zone_info": false, 00:20:25.268 "zone_management": false, 00:20:25.268 "zone_append": false, 00:20:25.268 "compare": false, 00:20:25.268 "compare_and_write": false, 00:20:25.268 "abort": false, 00:20:25.268 "seek_hole": false, 00:20:25.268 "seek_data": false, 00:20:25.268 "copy": false, 00:20:25.268 "nvme_iov_md": false 00:20:25.268 }, 00:20:25.268 "memory_domains": [ 00:20:25.268 { 00:20:25.268 "dma_device_id": "system", 00:20:25.268 "dma_device_type": 1 00:20:25.268 }, 00:20:25.268 { 00:20:25.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.268 "dma_device_type": 2 00:20:25.268 }, 00:20:25.268 { 00:20:25.268 "dma_device_id": "system", 00:20:25.268 "dma_device_type": 1 00:20:25.268 }, 00:20:25.268 { 00:20:25.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.268 "dma_device_type": 2 00:20:25.268 }, 00:20:25.268 { 00:20:25.268 "dma_device_id": "system", 00:20:25.268 "dma_device_type": 1 00:20:25.268 }, 00:20:25.268 { 00:20:25.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.268 "dma_device_type": 2 00:20:25.268 }, 00:20:25.268 { 00:20:25.268 "dma_device_id": "system", 00:20:25.268 "dma_device_type": 1 00:20:25.268 }, 00:20:25.268 { 00:20:25.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.268 "dma_device_type": 2 00:20:25.268 } 00:20:25.268 ], 00:20:25.268 "driver_specific": { 00:20:25.268 "raid": { 00:20:25.268 "uuid": "53616ce3-9dd3-4018-9869-e60466ff237b", 00:20:25.268 "strip_size_kb": 64, 00:20:25.268 "state": "online", 00:20:25.268 "raid_level": "concat", 00:20:25.268 "superblock": true, 00:20:25.268 "num_base_bdevs": 4, 00:20:25.269 "num_base_bdevs_discovered": 4, 00:20:25.269 "num_base_bdevs_operational": 4, 00:20:25.269 "base_bdevs_list": [ 00:20:25.269 { 00:20:25.269 "name": "pt1", 00:20:25.269 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:25.269 "is_configured": true, 00:20:25.269 "data_offset": 2048, 00:20:25.269 "data_size": 63488 00:20:25.269 }, 00:20:25.269 { 00:20:25.269 "name": "pt2", 00:20:25.269 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:25.269 "is_configured": true, 00:20:25.269 "data_offset": 2048, 00:20:25.269 "data_size": 63488 00:20:25.269 }, 00:20:25.269 { 00:20:25.269 "name": "pt3", 00:20:25.269 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:25.269 "is_configured": true, 00:20:25.269 "data_offset": 2048, 00:20:25.269 "data_size": 63488 00:20:25.269 }, 00:20:25.269 { 00:20:25.269 "name": "pt4", 00:20:25.269 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:25.269 "is_configured": true, 00:20:25.269 "data_offset": 2048, 00:20:25.269 "data_size": 63488 00:20:25.269 } 00:20:25.269 ] 00:20:25.269 } 00:20:25.269 } 00:20:25.269 }' 00:20:25.269 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:25.526 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:20:25.526 pt2 00:20:25.526 pt3 00:20:25.526 pt4' 00:20:25.526 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:25.526 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:25.526 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:25.783 "name": "pt1", 00:20:25.783 "aliases": [ 00:20:25.783 "00000000-0000-0000-0000-000000000001" 00:20:25.783 ], 00:20:25.783 "product_name": "passthru", 00:20:25.783 "block_size": 512, 00:20:25.783 "num_blocks": 65536, 00:20:25.783 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:25.783 "assigned_rate_limits": { 00:20:25.783 "rw_ios_per_sec": 0, 00:20:25.783 "rw_mbytes_per_sec": 0, 00:20:25.783 "r_mbytes_per_sec": 0, 00:20:25.783 "w_mbytes_per_sec": 0 00:20:25.783 }, 00:20:25.783 "claimed": true, 00:20:25.783 "claim_type": "exclusive_write", 00:20:25.783 "zoned": false, 00:20:25.783 "supported_io_types": { 00:20:25.783 "read": true, 00:20:25.783 "write": true, 00:20:25.783 "unmap": true, 00:20:25.783 "flush": true, 00:20:25.783 "reset": true, 00:20:25.783 "nvme_admin": false, 00:20:25.783 "nvme_io": false, 00:20:25.783 "nvme_io_md": false, 00:20:25.783 "write_zeroes": true, 00:20:25.783 "zcopy": true, 00:20:25.783 "get_zone_info": false, 00:20:25.783 "zone_management": false, 00:20:25.783 "zone_append": false, 00:20:25.783 "compare": false, 00:20:25.783 "compare_and_write": false, 00:20:25.783 "abort": true, 00:20:25.783 "seek_hole": false, 00:20:25.783 "seek_data": false, 00:20:25.783 "copy": true, 00:20:25.783 "nvme_iov_md": false 00:20:25.783 }, 00:20:25.783 "memory_domains": [ 00:20:25.783 { 00:20:25.783 "dma_device_id": "system", 00:20:25.783 "dma_device_type": 1 00:20:25.783 }, 00:20:25.783 { 00:20:25.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.783 "dma_device_type": 2 00:20:25.783 } 00:20:25.783 ], 00:20:25.783 "driver_specific": { 00:20:25.783 "passthru": { 00:20:25.783 "name": "pt1", 00:20:25.783 "base_bdev_name": "malloc1" 00:20:25.783 } 00:20:25.783 } 00:20:25.783 }' 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.783 10:47:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:26.040 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:26.040 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:26.040 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:26.040 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:26.040 10:47:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:26.040 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:26.040 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:26.297 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:26.297 "name": "pt2", 00:20:26.297 "aliases": [ 00:20:26.297 "00000000-0000-0000-0000-000000000002" 00:20:26.297 ], 00:20:26.297 "product_name": "passthru", 00:20:26.297 "block_size": 512, 00:20:26.297 "num_blocks": 65536, 00:20:26.297 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:26.297 "assigned_rate_limits": { 00:20:26.297 "rw_ios_per_sec": 0, 00:20:26.297 "rw_mbytes_per_sec": 0, 00:20:26.297 "r_mbytes_per_sec": 0, 00:20:26.297 "w_mbytes_per_sec": 0 00:20:26.297 }, 00:20:26.297 "claimed": true, 00:20:26.297 "claim_type": "exclusive_write", 00:20:26.297 "zoned": false, 00:20:26.297 "supported_io_types": { 00:20:26.297 "read": true, 00:20:26.297 "write": true, 00:20:26.297 "unmap": true, 00:20:26.297 "flush": true, 00:20:26.297 "reset": true, 00:20:26.297 "nvme_admin": false, 00:20:26.297 "nvme_io": false, 00:20:26.297 "nvme_io_md": false, 00:20:26.297 "write_zeroes": true, 00:20:26.297 "zcopy": true, 00:20:26.297 "get_zone_info": false, 00:20:26.297 "zone_management": false, 00:20:26.297 "zone_append": false, 00:20:26.297 "compare": false, 00:20:26.297 "compare_and_write": false, 00:20:26.297 "abort": true, 00:20:26.297 "seek_hole": false, 00:20:26.297 "seek_data": false, 00:20:26.297 "copy": true, 00:20:26.297 "nvme_iov_md": false 00:20:26.297 }, 00:20:26.297 "memory_domains": [ 00:20:26.297 { 00:20:26.297 "dma_device_id": "system", 00:20:26.297 "dma_device_type": 1 00:20:26.297 }, 00:20:26.297 { 00:20:26.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.297 "dma_device_type": 2 00:20:26.297 } 00:20:26.297 ], 00:20:26.297 "driver_specific": { 00:20:26.297 "passthru": { 00:20:26.297 "name": "pt2", 00:20:26.297 "base_bdev_name": "malloc2" 00:20:26.297 } 00:20:26.297 } 00:20:26.297 }' 00:20:26.297 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.297 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.297 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:26.297 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:26.297 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:26.555 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:26.813 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:26.813 "name": "pt3", 00:20:26.813 "aliases": [ 00:20:26.813 "00000000-0000-0000-0000-000000000003" 00:20:26.813 ], 00:20:26.813 "product_name": "passthru", 00:20:26.813 "block_size": 512, 00:20:26.813 "num_blocks": 65536, 00:20:26.813 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:26.813 "assigned_rate_limits": { 00:20:26.813 "rw_ios_per_sec": 0, 00:20:26.813 "rw_mbytes_per_sec": 0, 00:20:26.813 "r_mbytes_per_sec": 0, 00:20:26.813 "w_mbytes_per_sec": 0 00:20:26.813 }, 00:20:26.813 "claimed": true, 00:20:26.813 "claim_type": "exclusive_write", 00:20:26.813 "zoned": false, 00:20:26.813 "supported_io_types": { 00:20:26.813 "read": true, 00:20:26.813 "write": true, 00:20:26.813 "unmap": true, 00:20:26.813 "flush": true, 00:20:26.813 "reset": true, 00:20:26.813 "nvme_admin": false, 00:20:26.813 "nvme_io": false, 00:20:26.814 "nvme_io_md": false, 00:20:26.814 "write_zeroes": true, 00:20:26.814 "zcopy": true, 00:20:26.814 "get_zone_info": false, 00:20:26.814 "zone_management": false, 00:20:26.814 "zone_append": false, 00:20:26.814 "compare": false, 00:20:26.814 "compare_and_write": false, 00:20:26.814 "abort": true, 00:20:26.814 "seek_hole": false, 00:20:26.814 "seek_data": false, 00:20:26.814 "copy": true, 00:20:26.814 "nvme_iov_md": false 00:20:26.814 }, 00:20:26.814 "memory_domains": [ 00:20:26.814 { 00:20:26.814 "dma_device_id": "system", 00:20:26.814 "dma_device_type": 1 00:20:26.814 }, 00:20:26.814 { 00:20:26.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.814 "dma_device_type": 2 00:20:26.814 } 00:20:26.814 ], 00:20:26.814 "driver_specific": { 00:20:26.814 "passthru": { 00:20:26.814 "name": "pt3", 00:20:26.814 "base_bdev_name": "malloc3" 00:20:26.814 } 00:20:26.814 } 00:20:26.814 }' 00:20:26.814 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.814 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.814 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:26.814 10:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:27.072 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:27.072 
10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:27.331 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:27.331 "name": "pt4", 00:20:27.331 "aliases": [ 00:20:27.331 "00000000-0000-0000-0000-000000000004" 00:20:27.331 ], 00:20:27.331 "product_name": "passthru", 00:20:27.331 "block_size": 512, 00:20:27.331 "num_blocks": 65536, 00:20:27.331 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:27.331 "assigned_rate_limits": { 00:20:27.331 "rw_ios_per_sec": 0, 00:20:27.331 "rw_mbytes_per_sec": 0, 00:20:27.331 "r_mbytes_per_sec": 0, 00:20:27.331 "w_mbytes_per_sec": 0 00:20:27.331 }, 00:20:27.331 "claimed": true, 00:20:27.331 "claim_type": "exclusive_write", 00:20:27.331 "zoned": false, 00:20:27.331 "supported_io_types": { 00:20:27.331 "read": true, 00:20:27.331 "write": true, 00:20:27.331 "unmap": true, 00:20:27.331 "flush": true, 00:20:27.331 "reset": true, 00:20:27.331 "nvme_admin": false, 00:20:27.331 "nvme_io": false, 00:20:27.331 "nvme_io_md": false, 00:20:27.331 "write_zeroes": true, 00:20:27.331 "zcopy": true, 00:20:27.331 "get_zone_info": false, 00:20:27.331 "zone_management": false, 00:20:27.331 "zone_append": false, 00:20:27.331 "compare": false, 00:20:27.331 "compare_and_write": false, 00:20:27.331 "abort": true, 00:20:27.331 "seek_hole": false, 00:20:27.331 "seek_data": false, 00:20:27.331 "copy": true, 00:20:27.331 "nvme_iov_md": false 00:20:27.331 }, 00:20:27.331 "memory_domains": [ 00:20:27.331 { 00:20:27.331 "dma_device_id": "system", 00:20:27.331 "dma_device_type": 1 00:20:27.331 }, 00:20:27.331 { 00:20:27.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:27.331 "dma_device_type": 2 00:20:27.331 } 00:20:27.331 ], 00:20:27.331 "driver_specific": { 00:20:27.331 "passthru": { 00:20:27.331 "name": "pt4", 00:20:27.331 "base_bdev_name": "malloc4" 00:20:27.331 } 00:20:27.331 } 00:20:27.331 }' 00:20:27.331 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:27.590 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.849 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.849 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:27.849 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:27.849 10:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:28.108 [2024-07-12 10:47:03.074349] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:28.108 10:47:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 53616ce3-9dd3-4018-9869-e60466ff237b '!=' 53616ce3-9dd3-4018-9869-e60466ff237b ']' 00:20:28.108 10:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:28.108 10:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:28.108 10:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:28.108 10:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2100248 00:20:28.108 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2100248 ']' 00:20:28.108 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2100248 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2100248 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2100248' 00:20:28.109 killing process with pid 2100248 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2100248 00:20:28.109 [2024-07-12 10:47:03.155692] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:28.109 [2024-07-12 10:47:03.155757] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:28.109 [2024-07-12 10:47:03.155818] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:28.109 [2024-07-12 10:47:03.155832] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f38f0 name raid_bdev1, state offline 00:20:28.109 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2100248 00:20:28.109 [2024-07-12 10:47:03.194453] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:28.367 10:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:28.367 00:20:28.367 real 0m17.438s 00:20:28.367 user 0m31.537s 00:20:28.367 sys 0m3.063s 00:20:28.368 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:28.368 10:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.368 ************************************ 00:20:28.368 END TEST raid_superblock_test 00:20:28.368 ************************************ 00:20:28.368 10:47:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:28.368 10:47:03 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:28.368 10:47:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:28.368 10:47:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:28.368 10:47:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:28.368 ************************************ 00:20:28.368 START TEST raid_read_error_test 00:20:28.368 ************************************ 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 
-- # raid_io_error_test concat 4 read 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.JwNPOsgF5Y 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2102850 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2102850 /var/tmp/spdk-raid.sock 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:28.368 
10:47:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2102850 ']' 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:28.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:28.368 10:47:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.626 [2024-07-12 10:47:03.584447] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:20:28.626 [2024-07-12 10:47:03.584524] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2102850 ] 00:20:28.626 [2024-07-12 10:47:03.699862] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:28.626 [2024-07-12 10:47:03.800503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:28.885 [2024-07-12 10:47:03.856141] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:28.885 [2024-07-12 10:47:03.856172] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:29.453 10:47:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:29.453 10:47:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:29.453 10:47:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:29.453 10:47:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:29.711 BaseBdev1_malloc 00:20:29.711 10:47:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:29.970 true 00:20:29.970 10:47:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:30.229 [2024-07-12 10:47:05.232306] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:30.229 [2024-07-12 10:47:05.232354] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.229 [2024-07-12 10:47:05.232379] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16350d0 00:20:30.229 [2024-07-12 10:47:05.232391] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.229 [2024-07-12 10:47:05.234085] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.229 [2024-07-12 10:47:05.234115] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:30.229 BaseBdev1 00:20:30.229 10:47:05 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:30.229 10:47:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:30.229 BaseBdev2_malloc 00:20:30.229 10:47:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:30.487 true 00:20:30.487 10:47:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:30.744 [2024-07-12 10:47:05.899948] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:30.744 [2024-07-12 10:47:05.899995] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.744 [2024-07-12 10:47:05.900016] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1639910 00:20:30.744 [2024-07-12 10:47:05.900029] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.744 [2024-07-12 10:47:05.901515] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.744 [2024-07-12 10:47:05.901544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:30.744 BaseBdev2 00:20:30.744 10:47:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:30.744 10:47:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:31.002 BaseBdev3_malloc 00:20:31.002 10:47:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:31.259 true 00:20:31.259 10:47:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:31.517 [2024-07-12 10:47:06.642379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:31.517 [2024-07-12 10:47:06.642423] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:31.517 [2024-07-12 10:47:06.642442] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x163bbd0 00:20:31.517 [2024-07-12 10:47:06.642454] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:31.517 [2024-07-12 10:47:06.643824] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:31.517 [2024-07-12 10:47:06.643854] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:31.517 BaseBdev3 00:20:31.517 10:47:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:31.517 10:47:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:31.775 BaseBdev4_malloc 00:20:31.775 10:47:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:32.033 true 00:20:32.033 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:32.291 [2024-07-12 10:47:07.384894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:32.291 [2024-07-12 10:47:07.384946] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:32.291 [2024-07-12 10:47:07.384966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x163caa0 00:20:32.291 [2024-07-12 10:47:07.384979] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:32.291 [2024-07-12 10:47:07.386400] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:32.291 [2024-07-12 10:47:07.386429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:32.291 BaseBdev4 00:20:32.291 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:32.549 [2024-07-12 10:47:07.629574] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:32.549 [2024-07-12 10:47:07.630754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:32.549 [2024-07-12 10:47:07.630820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:32.549 [2024-07-12 10:47:07.630882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:32.549 [2024-07-12 10:47:07.631109] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1636c20 00:20:32.549 [2024-07-12 10:47:07.631120] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:32.549 [2024-07-12 10:47:07.631297] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x148b260 00:20:32.549 [2024-07-12 10:47:07.631437] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1636c20 00:20:32.549 [2024-07-12 10:47:07.631447] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1636c20 00:20:32.549 [2024-07-12 10:47:07.631551] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:32.549 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:32.550 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.808 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:32.808 "name": "raid_bdev1", 00:20:32.808 "uuid": "df5f4d08-0231-4406-922e-e243dbe53a47", 00:20:32.808 "strip_size_kb": 64, 00:20:32.808 "state": "online", 00:20:32.808 "raid_level": "concat", 00:20:32.808 "superblock": true, 00:20:32.808 "num_base_bdevs": 4, 00:20:32.808 "num_base_bdevs_discovered": 4, 00:20:32.808 "num_base_bdevs_operational": 4, 00:20:32.808 "base_bdevs_list": [ 00:20:32.808 { 00:20:32.808 "name": "BaseBdev1", 00:20:32.808 "uuid": "08789000-00de-52b8-958c-249af85afe5e", 00:20:32.808 "is_configured": true, 00:20:32.808 "data_offset": 2048, 00:20:32.808 "data_size": 63488 00:20:32.808 }, 00:20:32.808 { 00:20:32.808 "name": "BaseBdev2", 00:20:32.808 "uuid": "cac78519-7833-5f60-9850-81bedd372b3b", 00:20:32.808 "is_configured": true, 00:20:32.808 "data_offset": 2048, 00:20:32.808 "data_size": 63488 00:20:32.808 }, 00:20:32.808 { 00:20:32.808 "name": "BaseBdev3", 00:20:32.808 "uuid": "eb81e721-e4b9-57cf-8888-edbb1b133940", 00:20:32.808 "is_configured": true, 00:20:32.808 "data_offset": 2048, 00:20:32.808 "data_size": 63488 00:20:32.808 }, 00:20:32.808 { 00:20:32.808 "name": "BaseBdev4", 00:20:32.808 "uuid": "10efdc16-2325-53d2-8bb2-2bb5ef980fd4", 00:20:32.808 "is_configured": true, 00:20:32.808 "data_offset": 2048, 00:20:32.808 "data_size": 63488 00:20:32.808 } 00:20:32.808 ] 00:20:32.808 }' 00:20:32.808 10:47:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:32.808 10:47:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.375 10:47:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:33.375 10:47:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:33.633 [2024-07-12 10:47:08.592410] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1628fc0 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:34.598 10:47:09 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.598 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.856 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.856 "name": "raid_bdev1", 00:20:34.856 "uuid": "df5f4d08-0231-4406-922e-e243dbe53a47", 00:20:34.856 "strip_size_kb": 64, 00:20:34.856 "state": "online", 00:20:34.856 "raid_level": "concat", 00:20:34.856 "superblock": true, 00:20:34.856 "num_base_bdevs": 4, 00:20:34.856 "num_base_bdevs_discovered": 4, 00:20:34.856 "num_base_bdevs_operational": 4, 00:20:34.856 "base_bdevs_list": [ 00:20:34.856 { 00:20:34.856 "name": "BaseBdev1", 00:20:34.856 "uuid": "08789000-00de-52b8-958c-249af85afe5e", 00:20:34.856 "is_configured": true, 00:20:34.856 "data_offset": 2048, 00:20:34.856 "data_size": 63488 00:20:34.856 }, 00:20:34.856 { 00:20:34.856 "name": "BaseBdev2", 00:20:34.856 "uuid": "cac78519-7833-5f60-9850-81bedd372b3b", 00:20:34.856 "is_configured": true, 00:20:34.856 "data_offset": 2048, 00:20:34.856 "data_size": 63488 00:20:34.856 }, 00:20:34.856 { 00:20:34.856 "name": "BaseBdev3", 00:20:34.856 "uuid": "eb81e721-e4b9-57cf-8888-edbb1b133940", 00:20:34.856 "is_configured": true, 00:20:34.856 "data_offset": 2048, 00:20:34.856 "data_size": 63488 00:20:34.856 }, 00:20:34.856 { 00:20:34.856 "name": "BaseBdev4", 00:20:34.856 "uuid": "10efdc16-2325-53d2-8bb2-2bb5ef980fd4", 00:20:34.856 "is_configured": true, 00:20:34.856 "data_offset": 2048, 00:20:34.856 "data_size": 63488 00:20:34.856 } 00:20:34.856 ] 00:20:34.856 }' 00:20:34.856 10:47:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.856 10:47:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:35.422 10:47:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:35.682 [2024-07-12 10:47:10.839256] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:35.682 [2024-07-12 10:47:10.839302] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:35.682 [2024-07-12 10:47:10.842486] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:35.682 [2024-07-12 10:47:10.842526] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:35.682 [2024-07-12 10:47:10.842569] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:35.682 [2024-07-12 10:47:10.842580] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1636c20 name raid_bdev1, state offline 00:20:35.682 0 00:20:35.682 10:47:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2102850 00:20:35.682 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2102850 ']' 00:20:35.682 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2102850 00:20:35.682 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:35.682 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:35.682 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2102850 00:20:35.941 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:35.941 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:35.941 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2102850' 00:20:35.941 killing process with pid 2102850 00:20:35.941 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2102850 00:20:35.941 [2024-07-12 10:47:10.904591] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:35.942 10:47:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2102850 00:20:35.942 [2024-07-12 10:47:10.936257] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.JwNPOsgF5Y 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:36.200 00:20:36.200 real 0m7.672s 00:20:36.200 user 0m12.308s 00:20:36.200 sys 0m1.307s 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:36.200 10:47:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.200 ************************************ 00:20:36.200 END TEST raid_read_error_test 00:20:36.200 ************************************ 00:20:36.200 10:47:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:36.200 10:47:11 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:36.200 10:47:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:36.200 10:47:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:36.200 10:47:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:36.200 ************************************ 00:20:36.200 START TEST raid_write_error_test 00:20:36.200 ************************************ 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 write 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.200 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.NijlSHV8PG 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2103999 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2103999 /var/tmp/spdk-raid.sock 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2103999 ']' 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:36.201 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.201 10:47:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:36.201 [2024-07-12 10:47:11.338934] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:20:36.201 [2024-07-12 10:47:11.338997] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2103999 ] 00:20:36.459 [2024-07-12 10:47:11.465730] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:36.459 [2024-07-12 10:47:11.568469] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:36.459 [2024-07-12 10:47:11.631149] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:36.459 [2024-07-12 10:47:11.631178] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:37.393 10:47:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:37.393 10:47:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:37.393 10:47:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:37.393 10:47:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:37.393 BaseBdev1_malloc 00:20:37.393 10:47:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:37.651 true 00:20:37.651 10:47:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:37.910 [2024-07-12 10:47:12.989728] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:37.910 [2024-07-12 10:47:12.989774] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.910 [2024-07-12 10:47:12.989795] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec90d0 00:20:37.910 [2024-07-12 10:47:12.989807] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.910 [2024-07-12 10:47:12.991700] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:37.910 [2024-07-12 10:47:12.991732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:37.910 BaseBdev1 
00:20:37.910 10:47:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:37.910 10:47:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:38.168 BaseBdev2_malloc 00:20:38.168 10:47:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:38.426 true 00:20:38.426 10:47:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:38.684 [2024-07-12 10:47:13.713410] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:38.684 [2024-07-12 10:47:13.713456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:38.684 [2024-07-12 10:47:13.713476] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xecd910 00:20:38.684 [2024-07-12 10:47:13.713496] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:38.684 [2024-07-12 10:47:13.715060] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:38.684 [2024-07-12 10:47:13.715089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:38.684 BaseBdev2 00:20:38.684 10:47:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:38.684 10:47:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:38.942 BaseBdev3_malloc 00:20:38.942 10:47:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:39.200 true 00:20:39.200 10:47:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:39.458 [2024-07-12 10:47:14.420045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:39.458 [2024-07-12 10:47:14.420088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:39.458 [2024-07-12 10:47:14.420115] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xecfbd0 00:20:39.458 [2024-07-12 10:47:14.420127] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:39.458 [2024-07-12 10:47:14.421701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:39.458 [2024-07-12 10:47:14.421732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:39.458 BaseBdev3 00:20:39.458 10:47:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:39.458 10:47:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:39.716 BaseBdev4_malloc 00:20:39.716 10:47:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:39.716 true 00:20:39.975 10:47:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:39.975 [2024-07-12 10:47:15.139741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:39.975 [2024-07-12 10:47:15.139787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:39.975 [2024-07-12 10:47:15.139808] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed0aa0 00:20:39.975 [2024-07-12 10:47:15.139822] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:39.975 [2024-07-12 10:47:15.141426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:39.975 [2024-07-12 10:47:15.141456] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:39.975 BaseBdev4 00:20:39.975 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:40.233 [2024-07-12 10:47:15.372396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:40.233 [2024-07-12 10:47:15.373788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:40.233 [2024-07-12 10:47:15.373856] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:40.233 [2024-07-12 10:47:15.373917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:40.233 [2024-07-12 10:47:15.374149] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xecac20 00:20:40.233 [2024-07-12 10:47:15.374160] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:40.233 [2024-07-12 10:47:15.374363] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd1f260 00:20:40.233 [2024-07-12 10:47:15.374522] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xecac20 00:20:40.233 [2024-07-12 10:47:15.374532] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xecac20 00:20:40.233 [2024-07-12 10:47:15.374640] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.233 10:47:15 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.233 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.491 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.491 "name": "raid_bdev1", 00:20:40.491 "uuid": "6c63d51a-7f6a-4689-9372-f9a87731e38a", 00:20:40.491 "strip_size_kb": 64, 00:20:40.491 "state": "online", 00:20:40.491 "raid_level": "concat", 00:20:40.491 "superblock": true, 00:20:40.491 "num_base_bdevs": 4, 00:20:40.491 "num_base_bdevs_discovered": 4, 00:20:40.491 "num_base_bdevs_operational": 4, 00:20:40.491 "base_bdevs_list": [ 00:20:40.491 { 00:20:40.491 "name": "BaseBdev1", 00:20:40.491 "uuid": "5ba83e0d-b74e-5572-81e1-6f4a2a43d5b9", 00:20:40.491 "is_configured": true, 00:20:40.491 "data_offset": 2048, 00:20:40.491 "data_size": 63488 00:20:40.491 }, 00:20:40.491 { 00:20:40.491 "name": "BaseBdev2", 00:20:40.491 "uuid": "c4a027a7-9d71-5182-a8d0-c2c27fb8db81", 00:20:40.491 "is_configured": true, 00:20:40.491 "data_offset": 2048, 00:20:40.491 "data_size": 63488 00:20:40.491 }, 00:20:40.491 { 00:20:40.491 "name": "BaseBdev3", 00:20:40.491 "uuid": "5bd14e8f-438f-5bd6-afaa-5b7dc6f49635", 00:20:40.491 "is_configured": true, 00:20:40.491 "data_offset": 2048, 00:20:40.491 "data_size": 63488 00:20:40.491 }, 00:20:40.491 { 00:20:40.491 "name": "BaseBdev4", 00:20:40.491 "uuid": "4ab28bee-7851-5557-ae6f-7d10b565b119", 00:20:40.491 "is_configured": true, 00:20:40.491 "data_offset": 2048, 00:20:40.491 "data_size": 63488 00:20:40.491 } 00:20:40.491 ] 00:20:40.491 }' 00:20:40.491 10:47:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.491 10:47:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:41.058 10:47:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:41.058 10:47:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:41.316 [2024-07-12 10:47:16.355255] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xebcfc0 00:20:42.248 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.507 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.765 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.765 "name": "raid_bdev1", 00:20:42.765 "uuid": "6c63d51a-7f6a-4689-9372-f9a87731e38a", 00:20:42.765 "strip_size_kb": 64, 00:20:42.765 "state": "online", 00:20:42.765 "raid_level": "concat", 00:20:42.765 "superblock": true, 00:20:42.765 "num_base_bdevs": 4, 00:20:42.765 "num_base_bdevs_discovered": 4, 00:20:42.765 "num_base_bdevs_operational": 4, 00:20:42.765 "base_bdevs_list": [ 00:20:42.765 { 00:20:42.765 "name": "BaseBdev1", 00:20:42.765 "uuid": "5ba83e0d-b74e-5572-81e1-6f4a2a43d5b9", 00:20:42.765 "is_configured": true, 00:20:42.765 "data_offset": 2048, 00:20:42.765 "data_size": 63488 00:20:42.765 }, 00:20:42.765 { 00:20:42.765 "name": "BaseBdev2", 00:20:42.765 "uuid": "c4a027a7-9d71-5182-a8d0-c2c27fb8db81", 00:20:42.765 "is_configured": true, 00:20:42.765 "data_offset": 2048, 00:20:42.765 "data_size": 63488 00:20:42.765 }, 00:20:42.765 { 00:20:42.765 "name": "BaseBdev3", 00:20:42.765 "uuid": "5bd14e8f-438f-5bd6-afaa-5b7dc6f49635", 00:20:42.765 "is_configured": true, 00:20:42.765 "data_offset": 2048, 00:20:42.765 "data_size": 63488 00:20:42.765 }, 00:20:42.765 { 00:20:42.765 "name": "BaseBdev4", 00:20:42.765 "uuid": "4ab28bee-7851-5557-ae6f-7d10b565b119", 00:20:42.765 "is_configured": true, 00:20:42.765 "data_offset": 2048, 00:20:42.765 "data_size": 63488 00:20:42.765 } 00:20:42.765 ] 00:20:42.765 }' 00:20:42.765 10:47:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.765 10:47:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.332 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:43.591 [2024-07-12 10:47:18.569189] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:43.591 [2024-07-12 10:47:18.569227] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:43.591 [2024-07-12 10:47:18.572391] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:43.591 [2024-07-12 10:47:18.572429] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:43.591 [2024-07-12 10:47:18.572470] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:20:43.591 [2024-07-12 10:47:18.572487] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xecac20 name raid_bdev1, state offline 00:20:43.591 0 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2103999 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2103999 ']' 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2103999 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2103999 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2103999' 00:20:43.591 killing process with pid 2103999 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2103999 00:20:43.591 [2024-07-12 10:47:18.638479] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:43.591 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2103999 00:20:43.591 [2024-07-12 10:47:18.670668] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.NijlSHV8PG 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:43.851 00:20:43.851 real 0m7.643s 00:20:43.851 user 0m12.232s 00:20:43.851 sys 0m1.343s 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:43.851 10:47:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.851 ************************************ 00:20:43.851 END TEST raid_write_error_test 00:20:43.851 ************************************ 00:20:43.851 10:47:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:43.851 10:47:18 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:43.851 10:47:18 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:20:43.851 10:47:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:43.851 10:47:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:43.851 10:47:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:43.851 
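For readers skimming the trace: everything in the raid_write_error_test block above is driven through SPDK's JSON-RPC client against the /var/tmp/spdk-raid.sock socket. The lines below are only a condensed sketch of the key calls as they appear in the xtrace (timings and the surrounding shell wrappers omitted; the RPC variable is editorial shorthand, not part of the test script):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Inspect the concat raid bdev built by the test (state, strip size, base bdevs).
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

  # Inject a write failure into the first base bdev, then tear the raid bdev down.
  $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
  $RPC bdev_raid_delete raid_bdev1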
************************************ 00:20:43.851 START TEST raid_state_function_test 00:20:43.851 ************************************ 00:20:43.851 10:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:20:43.851 10:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:43.851 10:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:43.851 10:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2104997 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2104997' 00:20:43.851 Process raid 
pid: 2104997 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2104997 /var/tmp/spdk-raid.sock 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2104997 ']' 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:43.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:43.851 10:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:44.111 [2024-07-12 10:47:19.066910] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:20:44.111 [2024-07-12 10:47:19.066967] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:44.111 [2024-07-12 10:47:19.182247] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:44.111 [2024-07-12 10:47:19.285064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:44.369 [2024-07-12 10:47:19.351087] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:44.370 [2024-07-12 10:47:19.351117] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:44.937 10:47:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:44.937 10:47:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:44.937 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:45.196 [2024-07-12 10:47:20.240885] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:45.196 [2024-07-12 10:47:20.240934] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:45.196 [2024-07-12 10:47:20.240945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:45.196 [2024-07-12 10:47:20.240957] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:45.196 [2024-07-12 10:47:20.240966] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:45.196 [2024-07-12 10:47:20.240978] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:45.196 [2024-07-12 10:47:20.240987] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:45.196 [2024-07-12 10:47:20.240998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev4 doesn't exist now 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.196 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.455 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.455 "name": "Existed_Raid", 00:20:45.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.455 "strip_size_kb": 0, 00:20:45.455 "state": "configuring", 00:20:45.455 "raid_level": "raid1", 00:20:45.455 "superblock": false, 00:20:45.455 "num_base_bdevs": 4, 00:20:45.455 "num_base_bdevs_discovered": 0, 00:20:45.455 "num_base_bdevs_operational": 4, 00:20:45.455 "base_bdevs_list": [ 00:20:45.455 { 00:20:45.455 "name": "BaseBdev1", 00:20:45.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.455 "is_configured": false, 00:20:45.456 "data_offset": 0, 00:20:45.456 "data_size": 0 00:20:45.456 }, 00:20:45.456 { 00:20:45.456 "name": "BaseBdev2", 00:20:45.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.456 "is_configured": false, 00:20:45.456 "data_offset": 0, 00:20:45.456 "data_size": 0 00:20:45.456 }, 00:20:45.456 { 00:20:45.456 "name": "BaseBdev3", 00:20:45.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.456 "is_configured": false, 00:20:45.456 "data_offset": 0, 00:20:45.456 "data_size": 0 00:20:45.456 }, 00:20:45.456 { 00:20:45.456 "name": "BaseBdev4", 00:20:45.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.456 "is_configured": false, 00:20:45.456 "data_offset": 0, 00:20:45.456 "data_size": 0 00:20:45.456 } 00:20:45.456 ] 00:20:45.456 }' 00:20:45.456 10:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.456 10:47:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:46.023 10:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:46.283 [2024-07-12 10:47:21.319620] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:46.283 [2024-07-12 10:47:21.319653] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x8b7aa0 name Existed_Raid, state configuring 00:20:46.283 10:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:46.542 [2024-07-12 10:47:21.564274] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:46.542 [2024-07-12 10:47:21.564306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:46.542 [2024-07-12 10:47:21.564316] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:46.542 [2024-07-12 10:47:21.564328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:46.542 [2024-07-12 10:47:21.564337] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:46.542 [2024-07-12 10:47:21.564348] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:46.542 [2024-07-12 10:47:21.564356] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:46.542 [2024-07-12 10:47:21.564368] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:46.542 10:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:46.801 [2024-07-12 10:47:21.814769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:46.801 BaseBdev1 00:20:46.801 10:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:46.801 10:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:46.801 10:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:46.801 10:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:46.801 10:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:46.801 10:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:46.801 10:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:47.060 10:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:47.320 [ 00:20:47.320 { 00:20:47.320 "name": "BaseBdev1", 00:20:47.320 "aliases": [ 00:20:47.320 "aba6979f-b70e-4615-9028-e70413518ace" 00:20:47.320 ], 00:20:47.320 "product_name": "Malloc disk", 00:20:47.320 "block_size": 512, 00:20:47.320 "num_blocks": 65536, 00:20:47.320 "uuid": "aba6979f-b70e-4615-9028-e70413518ace", 00:20:47.320 "assigned_rate_limits": { 00:20:47.320 "rw_ios_per_sec": 0, 00:20:47.320 "rw_mbytes_per_sec": 0, 00:20:47.320 "r_mbytes_per_sec": 0, 00:20:47.320 "w_mbytes_per_sec": 0 00:20:47.320 }, 00:20:47.320 "claimed": true, 00:20:47.320 "claim_type": "exclusive_write", 00:20:47.320 "zoned": false, 00:20:47.320 "supported_io_types": { 00:20:47.320 "read": true, 00:20:47.320 
"write": true, 00:20:47.320 "unmap": true, 00:20:47.320 "flush": true, 00:20:47.320 "reset": true, 00:20:47.320 "nvme_admin": false, 00:20:47.320 "nvme_io": false, 00:20:47.320 "nvme_io_md": false, 00:20:47.320 "write_zeroes": true, 00:20:47.320 "zcopy": true, 00:20:47.320 "get_zone_info": false, 00:20:47.320 "zone_management": false, 00:20:47.320 "zone_append": false, 00:20:47.320 "compare": false, 00:20:47.320 "compare_and_write": false, 00:20:47.320 "abort": true, 00:20:47.320 "seek_hole": false, 00:20:47.320 "seek_data": false, 00:20:47.320 "copy": true, 00:20:47.320 "nvme_iov_md": false 00:20:47.320 }, 00:20:47.320 "memory_domains": [ 00:20:47.320 { 00:20:47.320 "dma_device_id": "system", 00:20:47.320 "dma_device_type": 1 00:20:47.320 }, 00:20:47.320 { 00:20:47.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.320 "dma_device_type": 2 00:20:47.320 } 00:20:47.320 ], 00:20:47.320 "driver_specific": {} 00:20:47.320 } 00:20:47.320 ] 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.320 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.320 "name": "Existed_Raid", 00:20:47.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.320 "strip_size_kb": 0, 00:20:47.320 "state": "configuring", 00:20:47.320 "raid_level": "raid1", 00:20:47.320 "superblock": false, 00:20:47.320 "num_base_bdevs": 4, 00:20:47.320 "num_base_bdevs_discovered": 1, 00:20:47.320 "num_base_bdevs_operational": 4, 00:20:47.320 "base_bdevs_list": [ 00:20:47.320 { 00:20:47.320 "name": "BaseBdev1", 00:20:47.320 "uuid": "aba6979f-b70e-4615-9028-e70413518ace", 00:20:47.320 "is_configured": true, 00:20:47.320 "data_offset": 0, 00:20:47.320 "data_size": 65536 00:20:47.320 }, 00:20:47.320 { 00:20:47.320 "name": "BaseBdev2", 00:20:47.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.321 "is_configured": false, 00:20:47.321 "data_offset": 0, 00:20:47.321 "data_size": 0 00:20:47.321 }, 00:20:47.321 { 00:20:47.321 "name": "BaseBdev3", 
00:20:47.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.321 "is_configured": false, 00:20:47.321 "data_offset": 0, 00:20:47.321 "data_size": 0 00:20:47.321 }, 00:20:47.321 { 00:20:47.321 "name": "BaseBdev4", 00:20:47.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.321 "is_configured": false, 00:20:47.321 "data_offset": 0, 00:20:47.321 "data_size": 0 00:20:47.321 } 00:20:47.321 ] 00:20:47.321 }' 00:20:47.321 10:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.321 10:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:48.289 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:48.289 [2024-07-12 10:47:23.306716] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:48.289 [2024-07-12 10:47:23.306759] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8b7310 name Existed_Raid, state configuring 00:20:48.289 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:48.560 [2024-07-12 10:47:23.543364] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:48.560 [2024-07-12 10:47:23.544824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:48.560 [2024-07-12 10:47:23.544859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:48.560 [2024-07-12 10:47:23.544870] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:48.560 [2024-07-12 10:47:23.544882] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:48.560 [2024-07-12 10:47:23.544892] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:48.560 [2024-07-12 10:47:23.544903] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.560 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:48.816 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.816 "name": "Existed_Raid", 00:20:48.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.816 "strip_size_kb": 0, 00:20:48.816 "state": "configuring", 00:20:48.816 "raid_level": "raid1", 00:20:48.816 "superblock": false, 00:20:48.816 "num_base_bdevs": 4, 00:20:48.816 "num_base_bdevs_discovered": 1, 00:20:48.816 "num_base_bdevs_operational": 4, 00:20:48.816 "base_bdevs_list": [ 00:20:48.816 { 00:20:48.817 "name": "BaseBdev1", 00:20:48.817 "uuid": "aba6979f-b70e-4615-9028-e70413518ace", 00:20:48.817 "is_configured": true, 00:20:48.817 "data_offset": 0, 00:20:48.817 "data_size": 65536 00:20:48.817 }, 00:20:48.817 { 00:20:48.817 "name": "BaseBdev2", 00:20:48.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.817 "is_configured": false, 00:20:48.817 "data_offset": 0, 00:20:48.817 "data_size": 0 00:20:48.817 }, 00:20:48.817 { 00:20:48.817 "name": "BaseBdev3", 00:20:48.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.817 "is_configured": false, 00:20:48.817 "data_offset": 0, 00:20:48.817 "data_size": 0 00:20:48.817 }, 00:20:48.817 { 00:20:48.817 "name": "BaseBdev4", 00:20:48.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.817 "is_configured": false, 00:20:48.817 "data_offset": 0, 00:20:48.817 "data_size": 0 00:20:48.817 } 00:20:48.817 ] 00:20:48.817 }' 00:20:48.817 10:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.817 10:47:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:49.383 [2024-07-12 10:47:24.537501] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:49.383 BaseBdev2 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:49.383 10:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:49.641 10:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:49.899 [ 00:20:49.899 { 
00:20:49.899 "name": "BaseBdev2", 00:20:49.899 "aliases": [ 00:20:49.899 "665db6b8-18d3-4e47-9680-ce4535d0650e" 00:20:49.899 ], 00:20:49.899 "product_name": "Malloc disk", 00:20:49.899 "block_size": 512, 00:20:49.899 "num_blocks": 65536, 00:20:49.899 "uuid": "665db6b8-18d3-4e47-9680-ce4535d0650e", 00:20:49.899 "assigned_rate_limits": { 00:20:49.899 "rw_ios_per_sec": 0, 00:20:49.899 "rw_mbytes_per_sec": 0, 00:20:49.899 "r_mbytes_per_sec": 0, 00:20:49.899 "w_mbytes_per_sec": 0 00:20:49.899 }, 00:20:49.899 "claimed": true, 00:20:49.899 "claim_type": "exclusive_write", 00:20:49.899 "zoned": false, 00:20:49.899 "supported_io_types": { 00:20:49.899 "read": true, 00:20:49.899 "write": true, 00:20:49.899 "unmap": true, 00:20:49.899 "flush": true, 00:20:49.899 "reset": true, 00:20:49.899 "nvme_admin": false, 00:20:49.899 "nvme_io": false, 00:20:49.899 "nvme_io_md": false, 00:20:49.899 "write_zeroes": true, 00:20:49.899 "zcopy": true, 00:20:49.899 "get_zone_info": false, 00:20:49.899 "zone_management": false, 00:20:49.899 "zone_append": false, 00:20:49.899 "compare": false, 00:20:49.899 "compare_and_write": false, 00:20:49.899 "abort": true, 00:20:49.899 "seek_hole": false, 00:20:49.899 "seek_data": false, 00:20:49.899 "copy": true, 00:20:49.899 "nvme_iov_md": false 00:20:49.899 }, 00:20:49.899 "memory_domains": [ 00:20:49.899 { 00:20:49.899 "dma_device_id": "system", 00:20:49.899 "dma_device_type": 1 00:20:49.899 }, 00:20:49.899 { 00:20:49.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.899 "dma_device_type": 2 00:20:49.899 } 00:20:49.899 ], 00:20:49.899 "driver_specific": {} 00:20:49.899 } 00:20:49.899 ] 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:49.899 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:49.900 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.900 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:49.900 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:49.900 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.900 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.900 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:50.158 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.158 
"name": "Existed_Raid", 00:20:50.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.158 "strip_size_kb": 0, 00:20:50.158 "state": "configuring", 00:20:50.158 "raid_level": "raid1", 00:20:50.158 "superblock": false, 00:20:50.158 "num_base_bdevs": 4, 00:20:50.158 "num_base_bdevs_discovered": 2, 00:20:50.158 "num_base_bdevs_operational": 4, 00:20:50.158 "base_bdevs_list": [ 00:20:50.158 { 00:20:50.158 "name": "BaseBdev1", 00:20:50.158 "uuid": "aba6979f-b70e-4615-9028-e70413518ace", 00:20:50.158 "is_configured": true, 00:20:50.158 "data_offset": 0, 00:20:50.158 "data_size": 65536 00:20:50.158 }, 00:20:50.158 { 00:20:50.158 "name": "BaseBdev2", 00:20:50.158 "uuid": "665db6b8-18d3-4e47-9680-ce4535d0650e", 00:20:50.158 "is_configured": true, 00:20:50.158 "data_offset": 0, 00:20:50.158 "data_size": 65536 00:20:50.158 }, 00:20:50.158 { 00:20:50.158 "name": "BaseBdev3", 00:20:50.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.158 "is_configured": false, 00:20:50.158 "data_offset": 0, 00:20:50.158 "data_size": 0 00:20:50.158 }, 00:20:50.158 { 00:20:50.158 "name": "BaseBdev4", 00:20:50.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.158 "is_configured": false, 00:20:50.158 "data_offset": 0, 00:20:50.158 "data_size": 0 00:20:50.158 } 00:20:50.158 ] 00:20:50.158 }' 00:20:50.158 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.158 10:47:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.723 10:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:50.982 [2024-07-12 10:47:26.137091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:50.982 BaseBdev3 00:20:50.982 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:50.982 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:50.982 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:50.982 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:50.982 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:50.982 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:50.982 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:51.240 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:51.497 [ 00:20:51.497 { 00:20:51.497 "name": "BaseBdev3", 00:20:51.497 "aliases": [ 00:20:51.497 "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad" 00:20:51.497 ], 00:20:51.497 "product_name": "Malloc disk", 00:20:51.497 "block_size": 512, 00:20:51.497 "num_blocks": 65536, 00:20:51.497 "uuid": "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad", 00:20:51.497 "assigned_rate_limits": { 00:20:51.497 "rw_ios_per_sec": 0, 00:20:51.497 "rw_mbytes_per_sec": 0, 00:20:51.497 "r_mbytes_per_sec": 0, 00:20:51.497 "w_mbytes_per_sec": 0 00:20:51.497 }, 00:20:51.497 "claimed": true, 00:20:51.497 "claim_type": 
"exclusive_write", 00:20:51.497 "zoned": false, 00:20:51.497 "supported_io_types": { 00:20:51.497 "read": true, 00:20:51.497 "write": true, 00:20:51.497 "unmap": true, 00:20:51.497 "flush": true, 00:20:51.497 "reset": true, 00:20:51.497 "nvme_admin": false, 00:20:51.498 "nvme_io": false, 00:20:51.498 "nvme_io_md": false, 00:20:51.498 "write_zeroes": true, 00:20:51.498 "zcopy": true, 00:20:51.498 "get_zone_info": false, 00:20:51.498 "zone_management": false, 00:20:51.498 "zone_append": false, 00:20:51.498 "compare": false, 00:20:51.498 "compare_and_write": false, 00:20:51.498 "abort": true, 00:20:51.498 "seek_hole": false, 00:20:51.498 "seek_data": false, 00:20:51.498 "copy": true, 00:20:51.498 "nvme_iov_md": false 00:20:51.498 }, 00:20:51.498 "memory_domains": [ 00:20:51.498 { 00:20:51.498 "dma_device_id": "system", 00:20:51.498 "dma_device_type": 1 00:20:51.498 }, 00:20:51.498 { 00:20:51.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.498 "dma_device_type": 2 00:20:51.498 } 00:20:51.498 ], 00:20:51.498 "driver_specific": {} 00:20:51.498 } 00:20:51.498 ] 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.498 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:51.764 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:51.764 "name": "Existed_Raid", 00:20:51.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.764 "strip_size_kb": 0, 00:20:51.764 "state": "configuring", 00:20:51.764 "raid_level": "raid1", 00:20:51.764 "superblock": false, 00:20:51.764 "num_base_bdevs": 4, 00:20:51.764 "num_base_bdevs_discovered": 3, 00:20:51.764 "num_base_bdevs_operational": 4, 00:20:51.764 "base_bdevs_list": [ 00:20:51.764 { 00:20:51.764 "name": "BaseBdev1", 00:20:51.764 "uuid": "aba6979f-b70e-4615-9028-e70413518ace", 00:20:51.764 "is_configured": true, 00:20:51.764 
"data_offset": 0, 00:20:51.764 "data_size": 65536 00:20:51.764 }, 00:20:51.764 { 00:20:51.764 "name": "BaseBdev2", 00:20:51.764 "uuid": "665db6b8-18d3-4e47-9680-ce4535d0650e", 00:20:51.764 "is_configured": true, 00:20:51.764 "data_offset": 0, 00:20:51.764 "data_size": 65536 00:20:51.764 }, 00:20:51.764 { 00:20:51.764 "name": "BaseBdev3", 00:20:51.764 "uuid": "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad", 00:20:51.764 "is_configured": true, 00:20:51.764 "data_offset": 0, 00:20:51.764 "data_size": 65536 00:20:51.764 }, 00:20:51.764 { 00:20:51.764 "name": "BaseBdev4", 00:20:51.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.764 "is_configured": false, 00:20:51.764 "data_offset": 0, 00:20:51.764 "data_size": 0 00:20:51.764 } 00:20:51.764 ] 00:20:51.764 }' 00:20:51.764 10:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:51.764 10:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:52.328 10:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:52.585 [2024-07-12 10:47:27.660532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:52.585 [2024-07-12 10:47:27.660578] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8b8350 00:20:52.585 [2024-07-12 10:47:27.660587] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:52.585 [2024-07-12 10:47:27.660848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8b8020 00:20:52.585 [2024-07-12 10:47:27.660979] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8b8350 00:20:52.585 [2024-07-12 10:47:27.660989] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8b8350 00:20:52.585 [2024-07-12 10:47:27.661154] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:52.585 BaseBdev4 00:20:52.585 10:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:52.585 10:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:52.586 10:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:52.586 10:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:52.586 10:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:52.586 10:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:52.586 10:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:52.843 10:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:53.101 [ 00:20:53.101 { 00:20:53.101 "name": "BaseBdev4", 00:20:53.101 "aliases": [ 00:20:53.101 "71a641fe-f6c8-4d62-9174-70b6467eab8c" 00:20:53.101 ], 00:20:53.101 "product_name": "Malloc disk", 00:20:53.101 "block_size": 512, 00:20:53.101 "num_blocks": 65536, 00:20:53.101 "uuid": "71a641fe-f6c8-4d62-9174-70b6467eab8c", 00:20:53.101 "assigned_rate_limits": { 
00:20:53.101 "rw_ios_per_sec": 0, 00:20:53.101 "rw_mbytes_per_sec": 0, 00:20:53.101 "r_mbytes_per_sec": 0, 00:20:53.101 "w_mbytes_per_sec": 0 00:20:53.101 }, 00:20:53.101 "claimed": true, 00:20:53.101 "claim_type": "exclusive_write", 00:20:53.101 "zoned": false, 00:20:53.101 "supported_io_types": { 00:20:53.101 "read": true, 00:20:53.101 "write": true, 00:20:53.101 "unmap": true, 00:20:53.101 "flush": true, 00:20:53.101 "reset": true, 00:20:53.101 "nvme_admin": false, 00:20:53.101 "nvme_io": false, 00:20:53.101 "nvme_io_md": false, 00:20:53.101 "write_zeroes": true, 00:20:53.101 "zcopy": true, 00:20:53.101 "get_zone_info": false, 00:20:53.101 "zone_management": false, 00:20:53.101 "zone_append": false, 00:20:53.101 "compare": false, 00:20:53.101 "compare_and_write": false, 00:20:53.101 "abort": true, 00:20:53.101 "seek_hole": false, 00:20:53.101 "seek_data": false, 00:20:53.101 "copy": true, 00:20:53.101 "nvme_iov_md": false 00:20:53.101 }, 00:20:53.101 "memory_domains": [ 00:20:53.101 { 00:20:53.101 "dma_device_id": "system", 00:20:53.101 "dma_device_type": 1 00:20:53.101 }, 00:20:53.101 { 00:20:53.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.101 "dma_device_type": 2 00:20:53.101 } 00:20:53.101 ], 00:20:53.101 "driver_specific": {} 00:20:53.101 } 00:20:53.101 ] 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.101 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.358 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.358 "name": "Existed_Raid", 00:20:53.358 "uuid": "082bd47a-3c17-45db-b184-baaa058db7a5", 00:20:53.358 "strip_size_kb": 0, 00:20:53.358 "state": "online", 00:20:53.358 "raid_level": "raid1", 00:20:53.358 "superblock": false, 00:20:53.358 "num_base_bdevs": 4, 00:20:53.358 "num_base_bdevs_discovered": 4, 00:20:53.358 "num_base_bdevs_operational": 4, 
00:20:53.358 "base_bdevs_list": [ 00:20:53.358 { 00:20:53.358 "name": "BaseBdev1", 00:20:53.358 "uuid": "aba6979f-b70e-4615-9028-e70413518ace", 00:20:53.358 "is_configured": true, 00:20:53.358 "data_offset": 0, 00:20:53.358 "data_size": 65536 00:20:53.358 }, 00:20:53.358 { 00:20:53.358 "name": "BaseBdev2", 00:20:53.358 "uuid": "665db6b8-18d3-4e47-9680-ce4535d0650e", 00:20:53.358 "is_configured": true, 00:20:53.358 "data_offset": 0, 00:20:53.358 "data_size": 65536 00:20:53.358 }, 00:20:53.358 { 00:20:53.358 "name": "BaseBdev3", 00:20:53.358 "uuid": "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad", 00:20:53.358 "is_configured": true, 00:20:53.358 "data_offset": 0, 00:20:53.358 "data_size": 65536 00:20:53.358 }, 00:20:53.358 { 00:20:53.358 "name": "BaseBdev4", 00:20:53.358 "uuid": "71a641fe-f6c8-4d62-9174-70b6467eab8c", 00:20:53.358 "is_configured": true, 00:20:53.358 "data_offset": 0, 00:20:53.358 "data_size": 65536 00:20:53.358 } 00:20:53.358 ] 00:20:53.358 }' 00:20:53.358 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.358 10:47:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:53.922 10:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:54.180 [2024-07-12 10:47:29.193073] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:54.180 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:54.180 "name": "Existed_Raid", 00:20:54.180 "aliases": [ 00:20:54.180 "082bd47a-3c17-45db-b184-baaa058db7a5" 00:20:54.180 ], 00:20:54.180 "product_name": "Raid Volume", 00:20:54.180 "block_size": 512, 00:20:54.180 "num_blocks": 65536, 00:20:54.180 "uuid": "082bd47a-3c17-45db-b184-baaa058db7a5", 00:20:54.180 "assigned_rate_limits": { 00:20:54.180 "rw_ios_per_sec": 0, 00:20:54.180 "rw_mbytes_per_sec": 0, 00:20:54.180 "r_mbytes_per_sec": 0, 00:20:54.180 "w_mbytes_per_sec": 0 00:20:54.180 }, 00:20:54.180 "claimed": false, 00:20:54.180 "zoned": false, 00:20:54.180 "supported_io_types": { 00:20:54.180 "read": true, 00:20:54.180 "write": true, 00:20:54.180 "unmap": false, 00:20:54.180 "flush": false, 00:20:54.180 "reset": true, 00:20:54.180 "nvme_admin": false, 00:20:54.180 "nvme_io": false, 00:20:54.180 "nvme_io_md": false, 00:20:54.180 "write_zeroes": true, 00:20:54.180 "zcopy": false, 00:20:54.180 "get_zone_info": false, 00:20:54.180 "zone_management": false, 00:20:54.180 "zone_append": false, 00:20:54.180 "compare": false, 00:20:54.180 "compare_and_write": false, 00:20:54.180 "abort": false, 00:20:54.180 "seek_hole": false, 00:20:54.180 "seek_data": false, 
00:20:54.180 "copy": false, 00:20:54.180 "nvme_iov_md": false 00:20:54.180 }, 00:20:54.180 "memory_domains": [ 00:20:54.180 { 00:20:54.180 "dma_device_id": "system", 00:20:54.180 "dma_device_type": 1 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.180 "dma_device_type": 2 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "dma_device_id": "system", 00:20:54.180 "dma_device_type": 1 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.180 "dma_device_type": 2 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "dma_device_id": "system", 00:20:54.180 "dma_device_type": 1 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.180 "dma_device_type": 2 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "dma_device_id": "system", 00:20:54.180 "dma_device_type": 1 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.180 "dma_device_type": 2 00:20:54.180 } 00:20:54.180 ], 00:20:54.180 "driver_specific": { 00:20:54.180 "raid": { 00:20:54.180 "uuid": "082bd47a-3c17-45db-b184-baaa058db7a5", 00:20:54.180 "strip_size_kb": 0, 00:20:54.180 "state": "online", 00:20:54.180 "raid_level": "raid1", 00:20:54.180 "superblock": false, 00:20:54.180 "num_base_bdevs": 4, 00:20:54.180 "num_base_bdevs_discovered": 4, 00:20:54.180 "num_base_bdevs_operational": 4, 00:20:54.180 "base_bdevs_list": [ 00:20:54.180 { 00:20:54.180 "name": "BaseBdev1", 00:20:54.180 "uuid": "aba6979f-b70e-4615-9028-e70413518ace", 00:20:54.180 "is_configured": true, 00:20:54.180 "data_offset": 0, 00:20:54.180 "data_size": 65536 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "name": "BaseBdev2", 00:20:54.180 "uuid": "665db6b8-18d3-4e47-9680-ce4535d0650e", 00:20:54.180 "is_configured": true, 00:20:54.180 "data_offset": 0, 00:20:54.180 "data_size": 65536 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "name": "BaseBdev3", 00:20:54.180 "uuid": "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad", 00:20:54.180 "is_configured": true, 00:20:54.180 "data_offset": 0, 00:20:54.180 "data_size": 65536 00:20:54.180 }, 00:20:54.180 { 00:20:54.180 "name": "BaseBdev4", 00:20:54.180 "uuid": "71a641fe-f6c8-4d62-9174-70b6467eab8c", 00:20:54.180 "is_configured": true, 00:20:54.180 "data_offset": 0, 00:20:54.180 "data_size": 65536 00:20:54.180 } 00:20:54.180 ] 00:20:54.180 } 00:20:54.180 } 00:20:54.180 }' 00:20:54.180 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:54.180 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:54.180 BaseBdev2 00:20:54.180 BaseBdev3 00:20:54.180 BaseBdev4' 00:20:54.180 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.180 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:54.180 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.437 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.437 "name": "BaseBdev1", 00:20:54.437 "aliases": [ 00:20:54.437 "aba6979f-b70e-4615-9028-e70413518ace" 00:20:54.437 ], 00:20:54.437 "product_name": "Malloc disk", 00:20:54.437 "block_size": 512, 00:20:54.437 "num_blocks": 65536, 00:20:54.437 "uuid": 
"aba6979f-b70e-4615-9028-e70413518ace", 00:20:54.437 "assigned_rate_limits": { 00:20:54.437 "rw_ios_per_sec": 0, 00:20:54.437 "rw_mbytes_per_sec": 0, 00:20:54.437 "r_mbytes_per_sec": 0, 00:20:54.437 "w_mbytes_per_sec": 0 00:20:54.437 }, 00:20:54.437 "claimed": true, 00:20:54.437 "claim_type": "exclusive_write", 00:20:54.437 "zoned": false, 00:20:54.437 "supported_io_types": { 00:20:54.437 "read": true, 00:20:54.437 "write": true, 00:20:54.437 "unmap": true, 00:20:54.437 "flush": true, 00:20:54.437 "reset": true, 00:20:54.437 "nvme_admin": false, 00:20:54.437 "nvme_io": false, 00:20:54.437 "nvme_io_md": false, 00:20:54.437 "write_zeroes": true, 00:20:54.437 "zcopy": true, 00:20:54.437 "get_zone_info": false, 00:20:54.437 "zone_management": false, 00:20:54.437 "zone_append": false, 00:20:54.437 "compare": false, 00:20:54.437 "compare_and_write": false, 00:20:54.437 "abort": true, 00:20:54.437 "seek_hole": false, 00:20:54.437 "seek_data": false, 00:20:54.437 "copy": true, 00:20:54.437 "nvme_iov_md": false 00:20:54.437 }, 00:20:54.437 "memory_domains": [ 00:20:54.437 { 00:20:54.437 "dma_device_id": "system", 00:20:54.437 "dma_device_type": 1 00:20:54.437 }, 00:20:54.437 { 00:20:54.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.437 "dma_device_type": 2 00:20:54.437 } 00:20:54.437 ], 00:20:54.437 "driver_specific": {} 00:20:54.437 }' 00:20:54.437 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.437 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.437 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.437 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.437 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.438 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.438 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:54.695 10:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.953 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.953 "name": "BaseBdev2", 00:20:54.953 "aliases": [ 00:20:54.953 "665db6b8-18d3-4e47-9680-ce4535d0650e" 00:20:54.953 ], 00:20:54.953 "product_name": "Malloc disk", 00:20:54.953 "block_size": 512, 00:20:54.953 "num_blocks": 65536, 00:20:54.953 "uuid": "665db6b8-18d3-4e47-9680-ce4535d0650e", 00:20:54.953 "assigned_rate_limits": { 00:20:54.953 "rw_ios_per_sec": 0, 00:20:54.953 "rw_mbytes_per_sec": 0, 00:20:54.953 
"r_mbytes_per_sec": 0, 00:20:54.953 "w_mbytes_per_sec": 0 00:20:54.953 }, 00:20:54.953 "claimed": true, 00:20:54.953 "claim_type": "exclusive_write", 00:20:54.953 "zoned": false, 00:20:54.953 "supported_io_types": { 00:20:54.953 "read": true, 00:20:54.953 "write": true, 00:20:54.953 "unmap": true, 00:20:54.953 "flush": true, 00:20:54.953 "reset": true, 00:20:54.953 "nvme_admin": false, 00:20:54.953 "nvme_io": false, 00:20:54.953 "nvme_io_md": false, 00:20:54.953 "write_zeroes": true, 00:20:54.953 "zcopy": true, 00:20:54.953 "get_zone_info": false, 00:20:54.953 "zone_management": false, 00:20:54.954 "zone_append": false, 00:20:54.954 "compare": false, 00:20:54.954 "compare_and_write": false, 00:20:54.954 "abort": true, 00:20:54.954 "seek_hole": false, 00:20:54.954 "seek_data": false, 00:20:54.954 "copy": true, 00:20:54.954 "nvme_iov_md": false 00:20:54.954 }, 00:20:54.954 "memory_domains": [ 00:20:54.954 { 00:20:54.954 "dma_device_id": "system", 00:20:54.954 "dma_device_type": 1 00:20:54.954 }, 00:20:54.954 { 00:20:54.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.954 "dma_device_type": 2 00:20:54.954 } 00:20:54.954 ], 00:20:54.954 "driver_specific": {} 00:20:54.954 }' 00:20:54.954 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.954 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.954 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.954 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.954 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:55.211 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:55.470 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.470 "name": "BaseBdev3", 00:20:55.470 "aliases": [ 00:20:55.470 "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad" 00:20:55.470 ], 00:20:55.470 "product_name": "Malloc disk", 00:20:55.470 "block_size": 512, 00:20:55.470 "num_blocks": 65536, 00:20:55.470 "uuid": "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad", 00:20:55.470 "assigned_rate_limits": { 00:20:55.470 "rw_ios_per_sec": 0, 00:20:55.470 "rw_mbytes_per_sec": 0, 00:20:55.470 "r_mbytes_per_sec": 0, 00:20:55.470 "w_mbytes_per_sec": 0 00:20:55.470 }, 00:20:55.470 "claimed": true, 00:20:55.470 "claim_type": "exclusive_write", 00:20:55.470 "zoned": false, 
00:20:55.470 "supported_io_types": { 00:20:55.470 "read": true, 00:20:55.470 "write": true, 00:20:55.470 "unmap": true, 00:20:55.470 "flush": true, 00:20:55.470 "reset": true, 00:20:55.470 "nvme_admin": false, 00:20:55.470 "nvme_io": false, 00:20:55.470 "nvme_io_md": false, 00:20:55.470 "write_zeroes": true, 00:20:55.470 "zcopy": true, 00:20:55.470 "get_zone_info": false, 00:20:55.470 "zone_management": false, 00:20:55.470 "zone_append": false, 00:20:55.470 "compare": false, 00:20:55.470 "compare_and_write": false, 00:20:55.470 "abort": true, 00:20:55.470 "seek_hole": false, 00:20:55.470 "seek_data": false, 00:20:55.470 "copy": true, 00:20:55.470 "nvme_iov_md": false 00:20:55.470 }, 00:20:55.470 "memory_domains": [ 00:20:55.470 { 00:20:55.470 "dma_device_id": "system", 00:20:55.470 "dma_device_type": 1 00:20:55.470 }, 00:20:55.470 { 00:20:55.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.470 "dma_device_type": 2 00:20:55.470 } 00:20:55.470 ], 00:20:55.470 "driver_specific": {} 00:20:55.470 }' 00:20:55.470 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.470 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.728 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.986 10:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.986 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.986 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.986 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:55.986 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:56.245 "name": "BaseBdev4", 00:20:56.245 "aliases": [ 00:20:56.245 "71a641fe-f6c8-4d62-9174-70b6467eab8c" 00:20:56.245 ], 00:20:56.245 "product_name": "Malloc disk", 00:20:56.245 "block_size": 512, 00:20:56.245 "num_blocks": 65536, 00:20:56.245 "uuid": "71a641fe-f6c8-4d62-9174-70b6467eab8c", 00:20:56.245 "assigned_rate_limits": { 00:20:56.245 "rw_ios_per_sec": 0, 00:20:56.245 "rw_mbytes_per_sec": 0, 00:20:56.245 "r_mbytes_per_sec": 0, 00:20:56.245 "w_mbytes_per_sec": 0 00:20:56.245 }, 00:20:56.245 "claimed": true, 00:20:56.245 "claim_type": "exclusive_write", 00:20:56.245 "zoned": false, 00:20:56.245 "supported_io_types": { 00:20:56.245 "read": true, 00:20:56.245 "write": true, 00:20:56.245 "unmap": true, 00:20:56.245 "flush": true, 00:20:56.245 "reset": true, 
00:20:56.245 "nvme_admin": false, 00:20:56.245 "nvme_io": false, 00:20:56.245 "nvme_io_md": false, 00:20:56.245 "write_zeroes": true, 00:20:56.245 "zcopy": true, 00:20:56.245 "get_zone_info": false, 00:20:56.245 "zone_management": false, 00:20:56.245 "zone_append": false, 00:20:56.245 "compare": false, 00:20:56.245 "compare_and_write": false, 00:20:56.245 "abort": true, 00:20:56.245 "seek_hole": false, 00:20:56.245 "seek_data": false, 00:20:56.245 "copy": true, 00:20:56.245 "nvme_iov_md": false 00:20:56.245 }, 00:20:56.245 "memory_domains": [ 00:20:56.245 { 00:20:56.245 "dma_device_id": "system", 00:20:56.245 "dma_device_type": 1 00:20:56.245 }, 00:20:56.245 { 00:20:56.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.245 "dma_device_type": 2 00:20:56.245 } 00:20:56.245 ], 00:20:56.245 "driver_specific": {} 00:20:56.245 }' 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.245 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.504 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:56.504 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.504 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.504 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:56.504 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:56.762 [2024-07-12 10:47:31.771639] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.762 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.020 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.020 "name": "Existed_Raid", 00:20:57.020 "uuid": "082bd47a-3c17-45db-b184-baaa058db7a5", 00:20:57.020 "strip_size_kb": 0, 00:20:57.020 "state": "online", 00:20:57.020 "raid_level": "raid1", 00:20:57.020 "superblock": false, 00:20:57.020 "num_base_bdevs": 4, 00:20:57.020 "num_base_bdevs_discovered": 3, 00:20:57.020 "num_base_bdevs_operational": 3, 00:20:57.020 "base_bdevs_list": [ 00:20:57.020 { 00:20:57.020 "name": null, 00:20:57.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.020 "is_configured": false, 00:20:57.020 "data_offset": 0, 00:20:57.020 "data_size": 65536 00:20:57.020 }, 00:20:57.020 { 00:20:57.020 "name": "BaseBdev2", 00:20:57.020 "uuid": "665db6b8-18d3-4e47-9680-ce4535d0650e", 00:20:57.020 "is_configured": true, 00:20:57.020 "data_offset": 0, 00:20:57.021 "data_size": 65536 00:20:57.021 }, 00:20:57.021 { 00:20:57.021 "name": "BaseBdev3", 00:20:57.021 "uuid": "442f6b7e-08ae-4fbb-a699-bfcdb11e92ad", 00:20:57.021 "is_configured": true, 00:20:57.021 "data_offset": 0, 00:20:57.021 "data_size": 65536 00:20:57.021 }, 00:20:57.021 { 00:20:57.021 "name": "BaseBdev4", 00:20:57.021 "uuid": "71a641fe-f6c8-4d62-9174-70b6467eab8c", 00:20:57.021 "is_configured": true, 00:20:57.021 "data_offset": 0, 00:20:57.021 "data_size": 65536 00:20:57.021 } 00:20:57.021 ] 00:20:57.021 }' 00:20:57.021 10:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.021 10:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:57.588 10:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:57.588 10:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:57.589 10:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.589 10:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:57.847 10:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:57.847 10:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:57.847 10:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:57.847 [2024-07-12 10:47:33.004807] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:57.847 10:47:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:57.847 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:57.847 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.847 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:58.104 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:58.104 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:58.104 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:58.361 [2024-07-12 10:47:33.506586] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:58.361 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:58.361 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:58.361 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.361 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:58.619 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:58.619 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:58.619 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:58.906 [2024-07-12 10:47:33.934311] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:58.906 [2024-07-12 10:47:33.934393] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:58.906 [2024-07-12 10:47:33.947085] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:58.906 [2024-07-12 10:47:33.947122] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:58.906 [2024-07-12 10:47:33.947134] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8b8350 name Existed_Raid, state offline 00:20:58.906 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:58.906 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:58.906 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.906 10:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:59.474 10:47:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:59.474 BaseBdev2 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:59.474 10:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.733 10:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:59.991 [ 00:20:59.992 { 00:20:59.992 "name": "BaseBdev2", 00:20:59.992 "aliases": [ 00:20:59.992 "2dd64c31-0db8-4838-8fce-ad0ae416abb8" 00:20:59.992 ], 00:20:59.992 "product_name": "Malloc disk", 00:20:59.992 "block_size": 512, 00:20:59.992 "num_blocks": 65536, 00:20:59.992 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:20:59.992 "assigned_rate_limits": { 00:20:59.992 "rw_ios_per_sec": 0, 00:20:59.992 "rw_mbytes_per_sec": 0, 00:20:59.992 "r_mbytes_per_sec": 0, 00:20:59.992 "w_mbytes_per_sec": 0 00:20:59.992 }, 00:20:59.992 "claimed": false, 00:20:59.992 "zoned": false, 00:20:59.992 "supported_io_types": { 00:20:59.992 "read": true, 00:20:59.992 "write": true, 00:20:59.992 "unmap": true, 00:20:59.992 "flush": true, 00:20:59.992 "reset": true, 00:20:59.992 "nvme_admin": false, 00:20:59.992 "nvme_io": false, 00:20:59.992 "nvme_io_md": false, 00:20:59.992 "write_zeroes": true, 00:20:59.992 "zcopy": true, 00:20:59.992 "get_zone_info": false, 00:20:59.992 "zone_management": false, 00:20:59.992 "zone_append": false, 00:20:59.992 "compare": false, 00:20:59.992 "compare_and_write": false, 00:20:59.992 "abort": true, 00:20:59.992 "seek_hole": false, 00:20:59.992 "seek_data": false, 00:20:59.992 "copy": true, 00:20:59.992 "nvme_iov_md": false 00:20:59.992 }, 00:20:59.992 "memory_domains": [ 00:20:59.992 { 00:20:59.992 "dma_device_id": "system", 00:20:59.992 "dma_device_type": 1 00:20:59.992 }, 00:20:59.992 { 00:20:59.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.992 "dma_device_type": 2 00:20:59.992 } 00:20:59.992 ], 00:20:59.992 "driver_specific": {} 00:20:59.992 } 00:20:59.992 ] 00:20:59.992 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:59.992 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:59.992 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:59.992 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:00.250 BaseBdev3 00:21:00.251 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:00.251 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:00.251 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:00.251 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:00.251 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:00.251 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:00.251 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:00.509 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:00.769 [ 00:21:00.769 { 00:21:00.769 "name": "BaseBdev3", 00:21:00.769 "aliases": [ 00:21:00.769 "a6e11891-afe3-4672-a800-92577f9ecd4c" 00:21:00.769 ], 00:21:00.769 "product_name": "Malloc disk", 00:21:00.769 "block_size": 512, 00:21:00.769 "num_blocks": 65536, 00:21:00.769 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:00.769 "assigned_rate_limits": { 00:21:00.769 "rw_ios_per_sec": 0, 00:21:00.769 "rw_mbytes_per_sec": 0, 00:21:00.769 "r_mbytes_per_sec": 0, 00:21:00.769 "w_mbytes_per_sec": 0 00:21:00.769 }, 00:21:00.769 "claimed": false, 00:21:00.769 "zoned": false, 00:21:00.769 "supported_io_types": { 00:21:00.769 "read": true, 00:21:00.769 "write": true, 00:21:00.769 "unmap": true, 00:21:00.769 "flush": true, 00:21:00.769 "reset": true, 00:21:00.769 "nvme_admin": false, 00:21:00.769 "nvme_io": false, 00:21:00.769 "nvme_io_md": false, 00:21:00.769 "write_zeroes": true, 00:21:00.769 "zcopy": true, 00:21:00.769 "get_zone_info": false, 00:21:00.769 "zone_management": false, 00:21:00.769 "zone_append": false, 00:21:00.769 "compare": false, 00:21:00.769 "compare_and_write": false, 00:21:00.769 "abort": true, 00:21:00.769 "seek_hole": false, 00:21:00.769 "seek_data": false, 00:21:00.769 "copy": true, 00:21:00.769 "nvme_iov_md": false 00:21:00.769 }, 00:21:00.769 "memory_domains": [ 00:21:00.769 { 00:21:00.769 "dma_device_id": "system", 00:21:00.769 "dma_device_type": 1 00:21:00.769 }, 00:21:00.769 { 00:21:00.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.769 "dma_device_type": 2 00:21:00.769 } 00:21:00.769 ], 00:21:00.769 "driver_specific": {} 00:21:00.769 } 00:21:00.769 ] 00:21:00.769 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:00.769 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:00.769 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:00.769 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:01.028 BaseBdev4 00:21:01.028 10:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:01.028 10:47:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:01.028 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:01.028 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:01.028 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:01.028 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:01.028 10:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:01.028 10:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:01.287 [ 00:21:01.287 { 00:21:01.287 "name": "BaseBdev4", 00:21:01.287 "aliases": [ 00:21:01.287 "26ec77a4-7213-4161-b72b-a71b1ce918b0" 00:21:01.287 ], 00:21:01.287 "product_name": "Malloc disk", 00:21:01.287 "block_size": 512, 00:21:01.287 "num_blocks": 65536, 00:21:01.287 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:01.287 "assigned_rate_limits": { 00:21:01.287 "rw_ios_per_sec": 0, 00:21:01.287 "rw_mbytes_per_sec": 0, 00:21:01.287 "r_mbytes_per_sec": 0, 00:21:01.287 "w_mbytes_per_sec": 0 00:21:01.287 }, 00:21:01.287 "claimed": false, 00:21:01.287 "zoned": false, 00:21:01.287 "supported_io_types": { 00:21:01.287 "read": true, 00:21:01.287 "write": true, 00:21:01.287 "unmap": true, 00:21:01.287 "flush": true, 00:21:01.287 "reset": true, 00:21:01.287 "nvme_admin": false, 00:21:01.287 "nvme_io": false, 00:21:01.287 "nvme_io_md": false, 00:21:01.287 "write_zeroes": true, 00:21:01.287 "zcopy": true, 00:21:01.287 "get_zone_info": false, 00:21:01.287 "zone_management": false, 00:21:01.287 "zone_append": false, 00:21:01.287 "compare": false, 00:21:01.287 "compare_and_write": false, 00:21:01.287 "abort": true, 00:21:01.287 "seek_hole": false, 00:21:01.287 "seek_data": false, 00:21:01.287 "copy": true, 00:21:01.287 "nvme_iov_md": false 00:21:01.287 }, 00:21:01.287 "memory_domains": [ 00:21:01.287 { 00:21:01.287 "dma_device_id": "system", 00:21:01.287 "dma_device_type": 1 00:21:01.287 }, 00:21:01.287 { 00:21:01.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.287 "dma_device_type": 2 00:21:01.287 } 00:21:01.287 ], 00:21:01.287 "driver_specific": {} 00:21:01.287 } 00:21:01.287 ] 00:21:01.287 10:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:01.287 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:01.287 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:01.287 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:01.547 [2024-07-12 10:47:36.588304] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:01.547 [2024-07-12 10:47:36.588349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:01.547 [2024-07-12 10:47:36.588368] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
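The create-then-assemble pattern traced above comes down to a handful of rpc.py calls against the raid test socket. The lines below are a minimal sketch of that pattern (not the test script itself); the rpc.py path, socket, sizes, bdev names and RPC names are all taken from the trace, while the rpc shell variable is only a shorthand introduced here.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for name in BaseBdev2 BaseBdev3 BaseBdev4; do
    $rpc bdev_malloc_create 32 512 -b "$name"    # 32 MB malloc bdev, 512-byte blocks (65536 blocks, as the dumps above show)
    $rpc bdev_wait_for_examine                   # let bdev examine callbacks finish
    $rpc bdev_get_bdevs -b "$name" -t 2000       # wait up to 2000 ms for the new bdev to appear
done
# BaseBdev1 is deliberately missing at this point, so the raid1 volume below is created in the "configuring" state
$rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

When BaseBdev1 is created further down in the trace it is claimed immediately, and the volume only reaches the "online" state once all four base bdev slots are configured.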
00:21:01.547 [2024-07-12 10:47:36.589760] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:01.547 [2024-07-12 10:47:36.589803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.547 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:01.806 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.806 "name": "Existed_Raid", 00:21:01.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.806 "strip_size_kb": 0, 00:21:01.806 "state": "configuring", 00:21:01.806 "raid_level": "raid1", 00:21:01.806 "superblock": false, 00:21:01.806 "num_base_bdevs": 4, 00:21:01.806 "num_base_bdevs_discovered": 3, 00:21:01.806 "num_base_bdevs_operational": 4, 00:21:01.806 "base_bdevs_list": [ 00:21:01.806 { 00:21:01.806 "name": "BaseBdev1", 00:21:01.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.806 "is_configured": false, 00:21:01.806 "data_offset": 0, 00:21:01.806 "data_size": 0 00:21:01.806 }, 00:21:01.806 { 00:21:01.806 "name": "BaseBdev2", 00:21:01.806 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:01.806 "is_configured": true, 00:21:01.806 "data_offset": 0, 00:21:01.806 "data_size": 65536 00:21:01.806 }, 00:21:01.806 { 00:21:01.806 "name": "BaseBdev3", 00:21:01.806 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:01.806 "is_configured": true, 00:21:01.806 "data_offset": 0, 00:21:01.806 "data_size": 65536 00:21:01.806 }, 00:21:01.806 { 00:21:01.806 "name": "BaseBdev4", 00:21:01.806 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:01.806 "is_configured": true, 00:21:01.806 "data_offset": 0, 00:21:01.806 "data_size": 65536 00:21:01.806 } 00:21:01.806 ] 00:21:01.806 }' 00:21:01.806 10:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.806 10:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev2 00:21:02.800 [2024-07-12 10:47:37.899762] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.800 10:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.059 10:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.059 "name": "Existed_Raid", 00:21:03.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.060 "strip_size_kb": 0, 00:21:03.060 "state": "configuring", 00:21:03.060 "raid_level": "raid1", 00:21:03.060 "superblock": false, 00:21:03.060 "num_base_bdevs": 4, 00:21:03.060 "num_base_bdevs_discovered": 2, 00:21:03.060 "num_base_bdevs_operational": 4, 00:21:03.060 "base_bdevs_list": [ 00:21:03.060 { 00:21:03.060 "name": "BaseBdev1", 00:21:03.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.060 "is_configured": false, 00:21:03.060 "data_offset": 0, 00:21:03.060 "data_size": 0 00:21:03.060 }, 00:21:03.060 { 00:21:03.060 "name": null, 00:21:03.060 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:03.060 "is_configured": false, 00:21:03.060 "data_offset": 0, 00:21:03.060 "data_size": 65536 00:21:03.060 }, 00:21:03.060 { 00:21:03.060 "name": "BaseBdev3", 00:21:03.060 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:03.060 "is_configured": true, 00:21:03.060 "data_offset": 0, 00:21:03.060 "data_size": 65536 00:21:03.060 }, 00:21:03.060 { 00:21:03.060 "name": "BaseBdev4", 00:21:03.060 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:03.060 "is_configured": true, 00:21:03.060 "data_offset": 0, 00:21:03.060 "data_size": 65536 00:21:03.060 } 00:21:03.060 ] 00:21:03.060 }' 00:21:03.060 10:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.060 10:47:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:03.627 10:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.627 10:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:21:03.885 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:03.885 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:04.144 [2024-07-12 10:47:39.246885] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:04.144 BaseBdev1 00:21:04.144 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:04.144 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:04.144 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:04.144 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:04.144 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:04.144 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:04.144 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:04.403 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:04.662 [ 00:21:04.662 { 00:21:04.662 "name": "BaseBdev1", 00:21:04.662 "aliases": [ 00:21:04.662 "5e34116a-472a-47e2-b065-bfde00f1c399" 00:21:04.662 ], 00:21:04.662 "product_name": "Malloc disk", 00:21:04.662 "block_size": 512, 00:21:04.662 "num_blocks": 65536, 00:21:04.662 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:04.662 "assigned_rate_limits": { 00:21:04.662 "rw_ios_per_sec": 0, 00:21:04.662 "rw_mbytes_per_sec": 0, 00:21:04.662 "r_mbytes_per_sec": 0, 00:21:04.662 "w_mbytes_per_sec": 0 00:21:04.662 }, 00:21:04.662 "claimed": true, 00:21:04.662 "claim_type": "exclusive_write", 00:21:04.662 "zoned": false, 00:21:04.662 "supported_io_types": { 00:21:04.662 "read": true, 00:21:04.662 "write": true, 00:21:04.662 "unmap": true, 00:21:04.662 "flush": true, 00:21:04.662 "reset": true, 00:21:04.662 "nvme_admin": false, 00:21:04.662 "nvme_io": false, 00:21:04.662 "nvme_io_md": false, 00:21:04.662 "write_zeroes": true, 00:21:04.662 "zcopy": true, 00:21:04.662 "get_zone_info": false, 00:21:04.662 "zone_management": false, 00:21:04.662 "zone_append": false, 00:21:04.662 "compare": false, 00:21:04.662 "compare_and_write": false, 00:21:04.662 "abort": true, 00:21:04.662 "seek_hole": false, 00:21:04.662 "seek_data": false, 00:21:04.662 "copy": true, 00:21:04.662 "nvme_iov_md": false 00:21:04.662 }, 00:21:04.662 "memory_domains": [ 00:21:04.662 { 00:21:04.662 "dma_device_id": "system", 00:21:04.662 "dma_device_type": 1 00:21:04.662 }, 00:21:04.662 { 00:21:04.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.662 "dma_device_type": 2 00:21:04.662 } 00:21:04.662 ], 00:21:04.662 "driver_specific": {} 00:21:04.662 } 00:21:04.662 ] 00:21:04.662 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:04.663 10:47:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.663 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.922 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.922 "name": "Existed_Raid", 00:21:04.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.922 "strip_size_kb": 0, 00:21:04.922 "state": "configuring", 00:21:04.922 "raid_level": "raid1", 00:21:04.922 "superblock": false, 00:21:04.922 "num_base_bdevs": 4, 00:21:04.922 "num_base_bdevs_discovered": 3, 00:21:04.922 "num_base_bdevs_operational": 4, 00:21:04.922 "base_bdevs_list": [ 00:21:04.922 { 00:21:04.922 "name": "BaseBdev1", 00:21:04.922 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:04.922 "is_configured": true, 00:21:04.922 "data_offset": 0, 00:21:04.922 "data_size": 65536 00:21:04.922 }, 00:21:04.922 { 00:21:04.922 "name": null, 00:21:04.922 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:04.922 "is_configured": false, 00:21:04.922 "data_offset": 0, 00:21:04.922 "data_size": 65536 00:21:04.922 }, 00:21:04.922 { 00:21:04.922 "name": "BaseBdev3", 00:21:04.922 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:04.922 "is_configured": true, 00:21:04.922 "data_offset": 0, 00:21:04.922 "data_size": 65536 00:21:04.922 }, 00:21:04.922 { 00:21:04.922 "name": "BaseBdev4", 00:21:04.922 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:04.922 "is_configured": true, 00:21:04.922 "data_offset": 0, 00:21:04.922 "data_size": 65536 00:21:04.922 } 00:21:04.922 ] 00:21:04.922 }' 00:21:04.922 10:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.922 10:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:05.490 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.490 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:05.490 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:05.490 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:05.749 [2024-07-12 10:47:40.819067] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.749 10:47:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:06.317 10:47:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.317 "name": "Existed_Raid", 00:21:06.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.317 "strip_size_kb": 0, 00:21:06.317 "state": "configuring", 00:21:06.317 "raid_level": "raid1", 00:21:06.317 "superblock": false, 00:21:06.317 "num_base_bdevs": 4, 00:21:06.317 "num_base_bdevs_discovered": 2, 00:21:06.317 "num_base_bdevs_operational": 4, 00:21:06.317 "base_bdevs_list": [ 00:21:06.317 { 00:21:06.317 "name": "BaseBdev1", 00:21:06.317 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:06.317 "is_configured": true, 00:21:06.317 "data_offset": 0, 00:21:06.317 "data_size": 65536 00:21:06.317 }, 00:21:06.317 { 00:21:06.317 "name": null, 00:21:06.317 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:06.317 "is_configured": false, 00:21:06.317 "data_offset": 0, 00:21:06.317 "data_size": 65536 00:21:06.317 }, 00:21:06.317 { 00:21:06.317 "name": null, 00:21:06.317 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:06.317 "is_configured": false, 00:21:06.317 "data_offset": 0, 00:21:06.317 "data_size": 65536 00:21:06.317 }, 00:21:06.317 { 00:21:06.317 "name": "BaseBdev4", 00:21:06.317 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:06.317 "is_configured": true, 00:21:06.317 "data_offset": 0, 00:21:06.317 "data_size": 65536 00:21:06.317 } 00:21:06.317 ] 00:21:06.317 }' 00:21:06.317 10:47:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.317 10:47:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:06.885 10:47:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.885 10:47:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:07.144 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:07.144 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:07.403 [2024-07-12 10:47:42.339140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.403 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.404 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:07.974 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.974 "name": "Existed_Raid", 00:21:07.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.974 "strip_size_kb": 0, 00:21:07.974 "state": "configuring", 00:21:07.974 "raid_level": "raid1", 00:21:07.974 "superblock": false, 00:21:07.974 "num_base_bdevs": 4, 00:21:07.974 "num_base_bdevs_discovered": 3, 00:21:07.974 "num_base_bdevs_operational": 4, 00:21:07.975 "base_bdevs_list": [ 00:21:07.975 { 00:21:07.975 "name": "BaseBdev1", 00:21:07.975 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:07.975 "is_configured": true, 00:21:07.975 "data_offset": 0, 00:21:07.975 "data_size": 65536 00:21:07.975 }, 00:21:07.975 { 00:21:07.975 "name": null, 00:21:07.975 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:07.975 "is_configured": false, 00:21:07.975 "data_offset": 0, 00:21:07.975 "data_size": 65536 00:21:07.975 }, 00:21:07.975 { 00:21:07.975 "name": "BaseBdev3", 00:21:07.975 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:07.975 "is_configured": true, 00:21:07.975 "data_offset": 0, 00:21:07.975 "data_size": 65536 00:21:07.975 }, 00:21:07.975 { 00:21:07.975 "name": "BaseBdev4", 00:21:07.975 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:07.975 "is_configured": true, 00:21:07.975 "data_offset": 0, 00:21:07.975 "data_size": 65536 00:21:07.975 } 00:21:07.975 ] 00:21:07.975 }' 00:21:07.975 10:47:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
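Each remove/add above is verified with the same idiom: dump the raid bdev with bdev_raid_get_bdevs and pick out individual fields with jq. Below is a minimal sketch of that check, assuming the same socket; the check_slot helper name is illustrative and not part of the test script, but the jq filters are the ones visible in the trace.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
check_slot() {
    # $1 = index into base_bdevs_list, $2 = expected is_configured value ("true" or "false")
    local configured
    configured=$($rpc bdev_raid_get_bdevs all | jq ".[0].base_bdevs_list[$1].is_configured")
    [[ $configured == "$2" ]]
}
check_slot 2 false    # e.g. right after bdev_raid_remove_base_bdev BaseBdev3
check_slot 2 true     # and again after bdev_raid_add_base_bdev Existed_Raid BaseBdev3
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'    # still "configuring" at this point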
00:21:07.975 10:47:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:08.542 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.542 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:08.542 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:08.542 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:08.801 [2024-07-12 10:47:43.871195] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.801 10:47:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:09.061 10:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.061 "name": "Existed_Raid", 00:21:09.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.061 "strip_size_kb": 0, 00:21:09.061 "state": "configuring", 00:21:09.061 "raid_level": "raid1", 00:21:09.061 "superblock": false, 00:21:09.061 "num_base_bdevs": 4, 00:21:09.061 "num_base_bdevs_discovered": 2, 00:21:09.061 "num_base_bdevs_operational": 4, 00:21:09.061 "base_bdevs_list": [ 00:21:09.061 { 00:21:09.061 "name": null, 00:21:09.061 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:09.061 "is_configured": false, 00:21:09.061 "data_offset": 0, 00:21:09.061 "data_size": 65536 00:21:09.061 }, 00:21:09.061 { 00:21:09.061 "name": null, 00:21:09.061 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:09.061 "is_configured": false, 00:21:09.061 "data_offset": 0, 00:21:09.061 "data_size": 65536 00:21:09.061 }, 00:21:09.061 { 00:21:09.061 "name": "BaseBdev3", 00:21:09.061 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:09.061 "is_configured": true, 00:21:09.061 "data_offset": 0, 00:21:09.061 "data_size": 65536 00:21:09.061 }, 00:21:09.061 { 00:21:09.061 "name": 
"BaseBdev4", 00:21:09.061 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:09.061 "is_configured": true, 00:21:09.061 "data_offset": 0, 00:21:09.061 "data_size": 65536 00:21:09.061 } 00:21:09.061 ] 00:21:09.061 }' 00:21:09.061 10:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.061 10:47:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:09.628 10:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.628 10:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:09.887 10:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:09.887 10:47:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:10.146 [2024-07-12 10:47:45.183229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.146 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.404 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.404 "name": "Existed_Raid", 00:21:10.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.404 "strip_size_kb": 0, 00:21:10.404 "state": "configuring", 00:21:10.404 "raid_level": "raid1", 00:21:10.404 "superblock": false, 00:21:10.404 "num_base_bdevs": 4, 00:21:10.404 "num_base_bdevs_discovered": 3, 00:21:10.404 "num_base_bdevs_operational": 4, 00:21:10.404 "base_bdevs_list": [ 00:21:10.404 { 00:21:10.404 "name": null, 00:21:10.404 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:10.404 "is_configured": false, 00:21:10.404 "data_offset": 0, 00:21:10.404 "data_size": 65536 00:21:10.404 }, 00:21:10.404 { 00:21:10.404 "name": "BaseBdev2", 00:21:10.404 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:10.404 
"is_configured": true, 00:21:10.404 "data_offset": 0, 00:21:10.404 "data_size": 65536 00:21:10.404 }, 00:21:10.405 { 00:21:10.405 "name": "BaseBdev3", 00:21:10.405 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:10.405 "is_configured": true, 00:21:10.405 "data_offset": 0, 00:21:10.405 "data_size": 65536 00:21:10.405 }, 00:21:10.405 { 00:21:10.405 "name": "BaseBdev4", 00:21:10.405 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:10.405 "is_configured": true, 00:21:10.405 "data_offset": 0, 00:21:10.405 "data_size": 65536 00:21:10.405 } 00:21:10.405 ] 00:21:10.405 }' 00:21:10.405 10:47:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.405 10:47:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:10.971 10:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.971 10:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:11.230 10:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:11.230 10:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.230 10:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:11.796 10:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5e34116a-472a-47e2-b065-bfde00f1c399 00:21:11.796 [2024-07-12 10:47:46.967399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:11.796 [2024-07-12 10:47:46.967443] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8b6610 00:21:11.796 [2024-07-12 10:47:46.967452] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:11.796 [2024-07-12 10:47:46.967654] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8b7a70 00:21:11.796 [2024-07-12 10:47:46.967782] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8b6610 00:21:11.796 [2024-07-12 10:47:46.967792] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8b6610 00:21:11.796 [2024-07-12 10:47:46.967960] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:11.796 NewBaseBdev 00:21:12.054 10:47:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:12.054 10:47:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:12.054 10:47:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:12.054 10:47:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:12.054 10:47:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:12.054 10:47:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:12.054 10:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:12.054 10:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:12.312 [ 00:21:12.312 { 00:21:12.312 "name": "NewBaseBdev", 00:21:12.312 "aliases": [ 00:21:12.312 "5e34116a-472a-47e2-b065-bfde00f1c399" 00:21:12.312 ], 00:21:12.312 "product_name": "Malloc disk", 00:21:12.312 "block_size": 512, 00:21:12.312 "num_blocks": 65536, 00:21:12.312 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:12.312 "assigned_rate_limits": { 00:21:12.312 "rw_ios_per_sec": 0, 00:21:12.312 "rw_mbytes_per_sec": 0, 00:21:12.312 "r_mbytes_per_sec": 0, 00:21:12.312 "w_mbytes_per_sec": 0 00:21:12.312 }, 00:21:12.312 "claimed": true, 00:21:12.312 "claim_type": "exclusive_write", 00:21:12.312 "zoned": false, 00:21:12.312 "supported_io_types": { 00:21:12.312 "read": true, 00:21:12.312 "write": true, 00:21:12.312 "unmap": true, 00:21:12.312 "flush": true, 00:21:12.312 "reset": true, 00:21:12.312 "nvme_admin": false, 00:21:12.312 "nvme_io": false, 00:21:12.312 "nvme_io_md": false, 00:21:12.312 "write_zeroes": true, 00:21:12.312 "zcopy": true, 00:21:12.312 "get_zone_info": false, 00:21:12.312 "zone_management": false, 00:21:12.312 "zone_append": false, 00:21:12.312 "compare": false, 00:21:12.312 "compare_and_write": false, 00:21:12.312 "abort": true, 00:21:12.312 "seek_hole": false, 00:21:12.312 "seek_data": false, 00:21:12.312 "copy": true, 00:21:12.312 "nvme_iov_md": false 00:21:12.312 }, 00:21:12.312 "memory_domains": [ 00:21:12.312 { 00:21:12.312 "dma_device_id": "system", 00:21:12.312 "dma_device_type": 1 00:21:12.312 }, 00:21:12.312 { 00:21:12.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.312 "dma_device_type": 2 00:21:12.312 } 00:21:12.312 ], 00:21:12.312 "driver_specific": {} 00:21:12.312 } 00:21:12.312 ] 00:21:12.312 10:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.313 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:12.571 10:47:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.571 "name": "Existed_Raid", 00:21:12.571 "uuid": "e39a81e3-db1b-4714-8589-330b8aec6f7f", 00:21:12.571 "strip_size_kb": 0, 00:21:12.571 "state": "online", 00:21:12.571 "raid_level": "raid1", 00:21:12.571 "superblock": false, 00:21:12.571 "num_base_bdevs": 4, 00:21:12.571 "num_base_bdevs_discovered": 4, 00:21:12.571 "num_base_bdevs_operational": 4, 00:21:12.571 "base_bdevs_list": [ 00:21:12.571 { 00:21:12.571 "name": "NewBaseBdev", 00:21:12.571 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:12.571 "is_configured": true, 00:21:12.571 "data_offset": 0, 00:21:12.571 "data_size": 65536 00:21:12.571 }, 00:21:12.571 { 00:21:12.571 "name": "BaseBdev2", 00:21:12.571 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:12.571 "is_configured": true, 00:21:12.571 "data_offset": 0, 00:21:12.571 "data_size": 65536 00:21:12.571 }, 00:21:12.571 { 00:21:12.571 "name": "BaseBdev3", 00:21:12.571 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:12.571 "is_configured": true, 00:21:12.571 "data_offset": 0, 00:21:12.571 "data_size": 65536 00:21:12.571 }, 00:21:12.571 { 00:21:12.571 "name": "BaseBdev4", 00:21:12.571 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:12.571 "is_configured": true, 00:21:12.571 "data_offset": 0, 00:21:12.571 "data_size": 65536 00:21:12.571 } 00:21:12.571 ] 00:21:12.571 }' 00:21:12.571 10:47:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.571 10:47:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:13.137 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:13.397 [2024-07-12 10:47:48.403547] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:13.397 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:13.397 "name": "Existed_Raid", 00:21:13.397 "aliases": [ 00:21:13.397 "e39a81e3-db1b-4714-8589-330b8aec6f7f" 00:21:13.397 ], 00:21:13.397 "product_name": "Raid Volume", 00:21:13.397 "block_size": 512, 00:21:13.397 "num_blocks": 65536, 00:21:13.397 "uuid": "e39a81e3-db1b-4714-8589-330b8aec6f7f", 00:21:13.397 "assigned_rate_limits": { 00:21:13.397 "rw_ios_per_sec": 0, 00:21:13.397 "rw_mbytes_per_sec": 0, 00:21:13.397 "r_mbytes_per_sec": 0, 00:21:13.397 "w_mbytes_per_sec": 0 00:21:13.397 }, 00:21:13.397 "claimed": false, 00:21:13.397 "zoned": false, 00:21:13.397 "supported_io_types": { 00:21:13.397 "read": true, 00:21:13.397 "write": true, 00:21:13.397 "unmap": false, 00:21:13.397 "flush": false, 00:21:13.397 "reset": true, 
00:21:13.397 "nvme_admin": false, 00:21:13.397 "nvme_io": false, 00:21:13.397 "nvme_io_md": false, 00:21:13.397 "write_zeroes": true, 00:21:13.397 "zcopy": false, 00:21:13.397 "get_zone_info": false, 00:21:13.397 "zone_management": false, 00:21:13.397 "zone_append": false, 00:21:13.397 "compare": false, 00:21:13.397 "compare_and_write": false, 00:21:13.397 "abort": false, 00:21:13.397 "seek_hole": false, 00:21:13.397 "seek_data": false, 00:21:13.397 "copy": false, 00:21:13.397 "nvme_iov_md": false 00:21:13.397 }, 00:21:13.397 "memory_domains": [ 00:21:13.397 { 00:21:13.397 "dma_device_id": "system", 00:21:13.397 "dma_device_type": 1 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.397 "dma_device_type": 2 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "dma_device_id": "system", 00:21:13.397 "dma_device_type": 1 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.397 "dma_device_type": 2 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "dma_device_id": "system", 00:21:13.397 "dma_device_type": 1 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.397 "dma_device_type": 2 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "dma_device_id": "system", 00:21:13.397 "dma_device_type": 1 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.397 "dma_device_type": 2 00:21:13.397 } 00:21:13.397 ], 00:21:13.397 "driver_specific": { 00:21:13.397 "raid": { 00:21:13.397 "uuid": "e39a81e3-db1b-4714-8589-330b8aec6f7f", 00:21:13.397 "strip_size_kb": 0, 00:21:13.397 "state": "online", 00:21:13.397 "raid_level": "raid1", 00:21:13.397 "superblock": false, 00:21:13.397 "num_base_bdevs": 4, 00:21:13.397 "num_base_bdevs_discovered": 4, 00:21:13.397 "num_base_bdevs_operational": 4, 00:21:13.397 "base_bdevs_list": [ 00:21:13.397 { 00:21:13.397 "name": "NewBaseBdev", 00:21:13.397 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:13.397 "is_configured": true, 00:21:13.397 "data_offset": 0, 00:21:13.397 "data_size": 65536 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "name": "BaseBdev2", 00:21:13.397 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:13.397 "is_configured": true, 00:21:13.397 "data_offset": 0, 00:21:13.397 "data_size": 65536 00:21:13.397 }, 00:21:13.397 { 00:21:13.397 "name": "BaseBdev3", 00:21:13.397 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:13.397 "is_configured": true, 00:21:13.397 "data_offset": 0, 00:21:13.398 "data_size": 65536 00:21:13.398 }, 00:21:13.398 { 00:21:13.398 "name": "BaseBdev4", 00:21:13.398 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:13.398 "is_configured": true, 00:21:13.398 "data_offset": 0, 00:21:13.398 "data_size": 65536 00:21:13.398 } 00:21:13.398 ] 00:21:13.398 } 00:21:13.398 } 00:21:13.398 }' 00:21:13.398 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:13.398 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:13.398 BaseBdev2 00:21:13.398 BaseBdev3 00:21:13.398 BaseBdev4' 00:21:13.398 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:13.398 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:13.398 10:47:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:13.656 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:13.656 "name": "NewBaseBdev", 00:21:13.656 "aliases": [ 00:21:13.656 "5e34116a-472a-47e2-b065-bfde00f1c399" 00:21:13.656 ], 00:21:13.656 "product_name": "Malloc disk", 00:21:13.656 "block_size": 512, 00:21:13.656 "num_blocks": 65536, 00:21:13.656 "uuid": "5e34116a-472a-47e2-b065-bfde00f1c399", 00:21:13.656 "assigned_rate_limits": { 00:21:13.656 "rw_ios_per_sec": 0, 00:21:13.656 "rw_mbytes_per_sec": 0, 00:21:13.656 "r_mbytes_per_sec": 0, 00:21:13.656 "w_mbytes_per_sec": 0 00:21:13.656 }, 00:21:13.656 "claimed": true, 00:21:13.656 "claim_type": "exclusive_write", 00:21:13.656 "zoned": false, 00:21:13.656 "supported_io_types": { 00:21:13.656 "read": true, 00:21:13.656 "write": true, 00:21:13.656 "unmap": true, 00:21:13.656 "flush": true, 00:21:13.656 "reset": true, 00:21:13.656 "nvme_admin": false, 00:21:13.656 "nvme_io": false, 00:21:13.656 "nvme_io_md": false, 00:21:13.656 "write_zeroes": true, 00:21:13.656 "zcopy": true, 00:21:13.656 "get_zone_info": false, 00:21:13.656 "zone_management": false, 00:21:13.656 "zone_append": false, 00:21:13.656 "compare": false, 00:21:13.656 "compare_and_write": false, 00:21:13.656 "abort": true, 00:21:13.656 "seek_hole": false, 00:21:13.656 "seek_data": false, 00:21:13.656 "copy": true, 00:21:13.656 "nvme_iov_md": false 00:21:13.656 }, 00:21:13.656 "memory_domains": [ 00:21:13.656 { 00:21:13.656 "dma_device_id": "system", 00:21:13.656 "dma_device_type": 1 00:21:13.656 }, 00:21:13.656 { 00:21:13.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.656 "dma_device_type": 2 00:21:13.656 } 00:21:13.656 ], 00:21:13.656 "driver_specific": {} 00:21:13.656 }' 00:21:13.656 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.656 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.656 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:13.656 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.914 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.914 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:13.914 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.914 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.914 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.914 10:47:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.914 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.914 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.914 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:13.914 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:13.914 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:14.172 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:21:14.172 "name": "BaseBdev2", 00:21:14.172 "aliases": [ 00:21:14.172 "2dd64c31-0db8-4838-8fce-ad0ae416abb8" 00:21:14.172 ], 00:21:14.172 "product_name": "Malloc disk", 00:21:14.172 "block_size": 512, 00:21:14.172 "num_blocks": 65536, 00:21:14.172 "uuid": "2dd64c31-0db8-4838-8fce-ad0ae416abb8", 00:21:14.172 "assigned_rate_limits": { 00:21:14.172 "rw_ios_per_sec": 0, 00:21:14.172 "rw_mbytes_per_sec": 0, 00:21:14.172 "r_mbytes_per_sec": 0, 00:21:14.172 "w_mbytes_per_sec": 0 00:21:14.172 }, 00:21:14.172 "claimed": true, 00:21:14.172 "claim_type": "exclusive_write", 00:21:14.172 "zoned": false, 00:21:14.172 "supported_io_types": { 00:21:14.172 "read": true, 00:21:14.172 "write": true, 00:21:14.172 "unmap": true, 00:21:14.172 "flush": true, 00:21:14.172 "reset": true, 00:21:14.172 "nvme_admin": false, 00:21:14.172 "nvme_io": false, 00:21:14.172 "nvme_io_md": false, 00:21:14.172 "write_zeroes": true, 00:21:14.172 "zcopy": true, 00:21:14.172 "get_zone_info": false, 00:21:14.172 "zone_management": false, 00:21:14.172 "zone_append": false, 00:21:14.172 "compare": false, 00:21:14.172 "compare_and_write": false, 00:21:14.173 "abort": true, 00:21:14.173 "seek_hole": false, 00:21:14.173 "seek_data": false, 00:21:14.173 "copy": true, 00:21:14.173 "nvme_iov_md": false 00:21:14.173 }, 00:21:14.173 "memory_domains": [ 00:21:14.173 { 00:21:14.173 "dma_device_id": "system", 00:21:14.173 "dma_device_type": 1 00:21:14.173 }, 00:21:14.173 { 00:21:14.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.173 "dma_device_type": 2 00:21:14.173 } 00:21:14.173 ], 00:21:14.173 "driver_specific": {} 00:21:14.173 }' 00:21:14.173 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:14.430 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.688 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.688 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:14.688 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:14.688 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:14.688 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:14.947 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:14.947 "name": "BaseBdev3", 00:21:14.947 "aliases": [ 00:21:14.947 "a6e11891-afe3-4672-a800-92577f9ecd4c" 00:21:14.947 ], 00:21:14.947 "product_name": "Malloc disk", 
00:21:14.947 "block_size": 512, 00:21:14.947 "num_blocks": 65536, 00:21:14.947 "uuid": "a6e11891-afe3-4672-a800-92577f9ecd4c", 00:21:14.947 "assigned_rate_limits": { 00:21:14.947 "rw_ios_per_sec": 0, 00:21:14.947 "rw_mbytes_per_sec": 0, 00:21:14.947 "r_mbytes_per_sec": 0, 00:21:14.947 "w_mbytes_per_sec": 0 00:21:14.947 }, 00:21:14.947 "claimed": true, 00:21:14.947 "claim_type": "exclusive_write", 00:21:14.947 "zoned": false, 00:21:14.947 "supported_io_types": { 00:21:14.947 "read": true, 00:21:14.947 "write": true, 00:21:14.947 "unmap": true, 00:21:14.947 "flush": true, 00:21:14.947 "reset": true, 00:21:14.947 "nvme_admin": false, 00:21:14.947 "nvme_io": false, 00:21:14.947 "nvme_io_md": false, 00:21:14.947 "write_zeroes": true, 00:21:14.947 "zcopy": true, 00:21:14.947 "get_zone_info": false, 00:21:14.947 "zone_management": false, 00:21:14.947 "zone_append": false, 00:21:14.947 "compare": false, 00:21:14.947 "compare_and_write": false, 00:21:14.947 "abort": true, 00:21:14.947 "seek_hole": false, 00:21:14.947 "seek_data": false, 00:21:14.947 "copy": true, 00:21:14.947 "nvme_iov_md": false 00:21:14.947 }, 00:21:14.947 "memory_domains": [ 00:21:14.947 { 00:21:14.947 "dma_device_id": "system", 00:21:14.947 "dma_device_type": 1 00:21:14.947 }, 00:21:14.947 { 00:21:14.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.947 "dma_device_type": 2 00:21:14.947 } 00:21:14.947 ], 00:21:14.947 "driver_specific": {} 00:21:14.947 }' 00:21:14.947 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.947 10:47:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.947 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:14.947 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.947 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.947 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:14.947 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:15.205 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:15.465 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:15.465 "name": "BaseBdev4", 00:21:15.465 "aliases": [ 00:21:15.465 "26ec77a4-7213-4161-b72b-a71b1ce918b0" 00:21:15.465 ], 00:21:15.465 "product_name": "Malloc disk", 00:21:15.465 "block_size": 512, 00:21:15.465 "num_blocks": 65536, 00:21:15.465 "uuid": "26ec77a4-7213-4161-b72b-a71b1ce918b0", 00:21:15.465 "assigned_rate_limits": { 00:21:15.465 
"rw_ios_per_sec": 0, 00:21:15.465 "rw_mbytes_per_sec": 0, 00:21:15.465 "r_mbytes_per_sec": 0, 00:21:15.465 "w_mbytes_per_sec": 0 00:21:15.465 }, 00:21:15.465 "claimed": true, 00:21:15.465 "claim_type": "exclusive_write", 00:21:15.465 "zoned": false, 00:21:15.465 "supported_io_types": { 00:21:15.465 "read": true, 00:21:15.465 "write": true, 00:21:15.465 "unmap": true, 00:21:15.465 "flush": true, 00:21:15.465 "reset": true, 00:21:15.465 "nvme_admin": false, 00:21:15.465 "nvme_io": false, 00:21:15.465 "nvme_io_md": false, 00:21:15.465 "write_zeroes": true, 00:21:15.465 "zcopy": true, 00:21:15.465 "get_zone_info": false, 00:21:15.465 "zone_management": false, 00:21:15.465 "zone_append": false, 00:21:15.465 "compare": false, 00:21:15.465 "compare_and_write": false, 00:21:15.465 "abort": true, 00:21:15.465 "seek_hole": false, 00:21:15.465 "seek_data": false, 00:21:15.465 "copy": true, 00:21:15.465 "nvme_iov_md": false 00:21:15.465 }, 00:21:15.465 "memory_domains": [ 00:21:15.465 { 00:21:15.465 "dma_device_id": "system", 00:21:15.465 "dma_device_type": 1 00:21:15.465 }, 00:21:15.465 { 00:21:15.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.465 "dma_device_type": 2 00:21:15.465 } 00:21:15.465 ], 00:21:15.465 "driver_specific": {} 00:21:15.465 }' 00:21:15.465 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.465 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.465 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:15.465 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.465 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.725 10:47:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:15.984 [2024-07-12 10:47:51.094364] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:15.984 [2024-07-12 10:47:51.094393] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:15.984 [2024-07-12 10:47:51.094446] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:15.984 [2024-07-12 10:47:51.094721] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:15.984 [2024-07-12 10:47:51.094734] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8b6610 name Existed_Raid, state offline 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2104997 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@948 -- # '[' -z 2104997 ']' 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2104997 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2104997 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2104997' 00:21:15.984 killing process with pid 2104997 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2104997 00:21:15.984 [2024-07-12 10:47:51.162565] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:15.984 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2104997 00:21:16.244 [2024-07-12 10:47:51.205675] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:16.244 10:47:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:16.244 00:21:16.244 real 0m32.434s 00:21:16.244 user 0m59.679s 00:21:16.244 sys 0m5.642s 00:21:16.244 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:16.244 10:47:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:16.244 ************************************ 00:21:16.244 END TEST raid_state_function_test 00:21:16.244 ************************************ 00:21:16.505 10:47:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:16.505 10:47:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:21:16.505 10:47:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:16.505 10:47:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:16.505 10:47:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:16.505 ************************************ 00:21:16.505 START TEST raid_state_function_test_sb 00:21:16.505 ************************************ 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:16.505 10:47:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2109864 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2109864' 00:21:16.505 Process raid pid: 2109864 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2109864 /var/tmp/spdk-raid.sock 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2109864 ']' 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:16.505 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:16.505 10:47:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:16.505 [2024-07-12 10:47:51.594815] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:21:16.505 [2024-07-12 10:47:51.594896] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:16.832 [2024-07-12 10:47:51.724391] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:16.832 [2024-07-12 10:47:51.824193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:16.832 [2024-07-12 10:47:51.892805] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:16.832 [2024-07-12 10:47:51.892843] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:17.401 10:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:17.401 10:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:17.401 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:17.660 [2024-07-12 10:47:52.751728] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:17.660 [2024-07-12 10:47:52.751772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:17.660 [2024-07-12 10:47:52.751783] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:17.660 [2024-07-12 10:47:52.751795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:17.660 [2024-07-12 10:47:52.751804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:17.660 [2024-07-12 10:47:52.751815] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:17.660 [2024-07-12 10:47:52.751823] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:17.660 [2024-07-12 10:47:52.751834] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.660 10:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:17.919 10:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:17.919 "name": "Existed_Raid", 00:21:17.919 "uuid": "9f18bb57-c448-4481-af31-0cc09a4c19ac", 00:21:17.919 "strip_size_kb": 0, 00:21:17.919 "state": "configuring", 00:21:17.919 "raid_level": "raid1", 00:21:17.919 "superblock": true, 00:21:17.919 "num_base_bdevs": 4, 00:21:17.919 "num_base_bdevs_discovered": 0, 00:21:17.919 "num_base_bdevs_operational": 4, 00:21:17.919 "base_bdevs_list": [ 00:21:17.919 { 00:21:17.919 "name": "BaseBdev1", 00:21:17.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.919 "is_configured": false, 00:21:17.919 "data_offset": 0, 00:21:17.919 "data_size": 0 00:21:17.919 }, 00:21:17.919 { 00:21:17.919 "name": "BaseBdev2", 00:21:17.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.919 "is_configured": false, 00:21:17.919 "data_offset": 0, 00:21:17.919 "data_size": 0 00:21:17.919 }, 00:21:17.919 { 00:21:17.919 "name": "BaseBdev3", 00:21:17.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.919 "is_configured": false, 00:21:17.919 "data_offset": 0, 00:21:17.919 "data_size": 0 00:21:17.919 }, 00:21:17.919 { 00:21:17.919 "name": "BaseBdev4", 00:21:17.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.919 "is_configured": false, 00:21:17.919 "data_offset": 0, 00:21:17.919 "data_size": 0 00:21:17.919 } 00:21:17.919 ] 00:21:17.919 }' 00:21:17.919 10:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:17.919 10:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:18.484 10:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:18.742 [2024-07-12 10:47:53.846476] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:18.742 [2024-07-12 10:47:53.846513] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1722aa0 name Existed_Raid, state configuring 00:21:18.742 10:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:19.001 [2024-07-12 10:47:54.095162] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:19.001 [2024-07-12 10:47:54.095193] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:19.001 [2024-07-12 10:47:54.095202] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:19.001 [2024-07-12 10:47:54.095213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:19.001 [2024-07-12 10:47:54.095222] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:19.001 [2024-07-12 10:47:54.095233] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:19.001 [2024-07-12 10:47:54.095242] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:19.001 [2024-07-12 10:47:54.095253] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:19.001 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:19.260 [2024-07-12 10:47:54.349771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:19.260 BaseBdev1 00:21:19.260 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:19.260 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:19.260 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:19.260 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:19.260 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:19.260 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:19.260 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:19.519 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:19.777 [ 00:21:19.777 { 00:21:19.777 "name": "BaseBdev1", 00:21:19.777 "aliases": [ 00:21:19.777 "29b3bdfa-e337-405b-9ec5-db7543d608e3" 00:21:19.777 ], 00:21:19.777 "product_name": "Malloc disk", 00:21:19.777 "block_size": 512, 00:21:19.777 "num_blocks": 65536, 00:21:19.777 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:19.777 "assigned_rate_limits": { 00:21:19.777 "rw_ios_per_sec": 0, 00:21:19.777 "rw_mbytes_per_sec": 0, 00:21:19.777 "r_mbytes_per_sec": 0, 00:21:19.777 "w_mbytes_per_sec": 0 00:21:19.777 }, 00:21:19.777 "claimed": true, 00:21:19.777 "claim_type": "exclusive_write", 00:21:19.777 "zoned": false, 00:21:19.777 "supported_io_types": { 00:21:19.777 "read": true, 00:21:19.777 "write": true, 00:21:19.777 "unmap": true, 00:21:19.777 "flush": true, 00:21:19.777 "reset": true, 00:21:19.777 "nvme_admin": false, 00:21:19.777 "nvme_io": false, 00:21:19.777 "nvme_io_md": false, 00:21:19.777 "write_zeroes": true, 00:21:19.777 "zcopy": true, 00:21:19.777 "get_zone_info": false, 00:21:19.777 "zone_management": false, 00:21:19.777 "zone_append": false, 00:21:19.777 "compare": false, 00:21:19.777 "compare_and_write": false, 00:21:19.777 "abort": true, 00:21:19.777 "seek_hole": false, 00:21:19.777 "seek_data": false, 00:21:19.777 "copy": true, 00:21:19.777 "nvme_iov_md": false 00:21:19.777 }, 00:21:19.777 "memory_domains": [ 00:21:19.777 { 00:21:19.777 "dma_device_id": "system", 00:21:19.777 "dma_device_type": 1 00:21:19.777 }, 00:21:19.777 { 00:21:19.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.777 "dma_device_type": 2 00:21:19.777 } 00:21:19.777 ], 00:21:19.777 
"driver_specific": {} 00:21:19.777 } 00:21:19.777 ] 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.777 10:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.036 10:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.036 "name": "Existed_Raid", 00:21:20.036 "uuid": "120185d7-1377-4b2d-b95d-7433ff2c46ea", 00:21:20.036 "strip_size_kb": 0, 00:21:20.036 "state": "configuring", 00:21:20.036 "raid_level": "raid1", 00:21:20.036 "superblock": true, 00:21:20.036 "num_base_bdevs": 4, 00:21:20.036 "num_base_bdevs_discovered": 1, 00:21:20.036 "num_base_bdevs_operational": 4, 00:21:20.036 "base_bdevs_list": [ 00:21:20.036 { 00:21:20.036 "name": "BaseBdev1", 00:21:20.036 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:20.036 "is_configured": true, 00:21:20.036 "data_offset": 2048, 00:21:20.036 "data_size": 63488 00:21:20.036 }, 00:21:20.036 { 00:21:20.037 "name": "BaseBdev2", 00:21:20.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.037 "is_configured": false, 00:21:20.037 "data_offset": 0, 00:21:20.037 "data_size": 0 00:21:20.037 }, 00:21:20.037 { 00:21:20.037 "name": "BaseBdev3", 00:21:20.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.037 "is_configured": false, 00:21:20.037 "data_offset": 0, 00:21:20.037 "data_size": 0 00:21:20.037 }, 00:21:20.037 { 00:21:20.037 "name": "BaseBdev4", 00:21:20.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.037 "is_configured": false, 00:21:20.037 "data_offset": 0, 00:21:20.037 "data_size": 0 00:21:20.037 } 00:21:20.037 ] 00:21:20.037 }' 00:21:20.037 10:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.037 10:47:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:20.604 10:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:20.862 
[2024-07-12 10:47:55.913931] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:20.862 [2024-07-12 10:47:55.913974] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1722310 name Existed_Raid, state configuring 00:21:20.862 10:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:21.119 [2024-07-12 10:47:56.154614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:21.119 [2024-07-12 10:47:56.156057] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:21.119 [2024-07-12 10:47:56.156090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:21.119 [2024-07-12 10:47:56.156100] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:21.119 [2024-07-12 10:47:56.156112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:21.119 [2024-07-12 10:47:56.156121] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:21.119 [2024-07-12 10:47:56.156132] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:21.119 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:21.119 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:21.119 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.120 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:21.378 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.378 "name": "Existed_Raid", 00:21:21.378 "uuid": "68b7dba0-7fe5-456a-88a9-85b959017246", 00:21:21.378 "strip_size_kb": 0, 00:21:21.378 "state": "configuring", 00:21:21.378 "raid_level": "raid1", 00:21:21.378 "superblock": true, 00:21:21.378 
"num_base_bdevs": 4, 00:21:21.378 "num_base_bdevs_discovered": 1, 00:21:21.378 "num_base_bdevs_operational": 4, 00:21:21.378 "base_bdevs_list": [ 00:21:21.378 { 00:21:21.378 "name": "BaseBdev1", 00:21:21.378 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:21.378 "is_configured": true, 00:21:21.378 "data_offset": 2048, 00:21:21.378 "data_size": 63488 00:21:21.378 }, 00:21:21.378 { 00:21:21.378 "name": "BaseBdev2", 00:21:21.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.378 "is_configured": false, 00:21:21.378 "data_offset": 0, 00:21:21.378 "data_size": 0 00:21:21.378 }, 00:21:21.378 { 00:21:21.378 "name": "BaseBdev3", 00:21:21.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.378 "is_configured": false, 00:21:21.378 "data_offset": 0, 00:21:21.378 "data_size": 0 00:21:21.378 }, 00:21:21.378 { 00:21:21.378 "name": "BaseBdev4", 00:21:21.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.378 "is_configured": false, 00:21:21.378 "data_offset": 0, 00:21:21.378 "data_size": 0 00:21:21.378 } 00:21:21.378 ] 00:21:21.378 }' 00:21:21.378 10:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.378 10:47:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:21.945 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:22.204 [2024-07-12 10:47:57.252980] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:22.204 BaseBdev2 00:21:22.204 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:22.204 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:22.204 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:22.204 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:22.204 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:22.204 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:22.204 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:22.462 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:22.720 [ 00:21:22.720 { 00:21:22.720 "name": "BaseBdev2", 00:21:22.720 "aliases": [ 00:21:22.720 "354f5b4c-2755-408b-adbd-a6344a707122" 00:21:22.720 ], 00:21:22.720 "product_name": "Malloc disk", 00:21:22.720 "block_size": 512, 00:21:22.720 "num_blocks": 65536, 00:21:22.720 "uuid": "354f5b4c-2755-408b-adbd-a6344a707122", 00:21:22.720 "assigned_rate_limits": { 00:21:22.720 "rw_ios_per_sec": 0, 00:21:22.720 "rw_mbytes_per_sec": 0, 00:21:22.720 "r_mbytes_per_sec": 0, 00:21:22.720 "w_mbytes_per_sec": 0 00:21:22.720 }, 00:21:22.720 "claimed": true, 00:21:22.720 "claim_type": "exclusive_write", 00:21:22.720 "zoned": false, 00:21:22.720 "supported_io_types": { 00:21:22.720 "read": true, 00:21:22.720 "write": true, 00:21:22.720 "unmap": true, 00:21:22.720 "flush": true, 
00:21:22.720 "reset": true, 00:21:22.720 "nvme_admin": false, 00:21:22.720 "nvme_io": false, 00:21:22.720 "nvme_io_md": false, 00:21:22.720 "write_zeroes": true, 00:21:22.720 "zcopy": true, 00:21:22.720 "get_zone_info": false, 00:21:22.720 "zone_management": false, 00:21:22.720 "zone_append": false, 00:21:22.720 "compare": false, 00:21:22.720 "compare_and_write": false, 00:21:22.720 "abort": true, 00:21:22.720 "seek_hole": false, 00:21:22.720 "seek_data": false, 00:21:22.720 "copy": true, 00:21:22.720 "nvme_iov_md": false 00:21:22.720 }, 00:21:22.720 "memory_domains": [ 00:21:22.720 { 00:21:22.720 "dma_device_id": "system", 00:21:22.720 "dma_device_type": 1 00:21:22.720 }, 00:21:22.720 { 00:21:22.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.720 "dma_device_type": 2 00:21:22.720 } 00:21:22.720 ], 00:21:22.720 "driver_specific": {} 00:21:22.720 } 00:21:22.720 ] 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.720 10:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:22.978 10:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.978 "name": "Existed_Raid", 00:21:22.978 "uuid": "68b7dba0-7fe5-456a-88a9-85b959017246", 00:21:22.978 "strip_size_kb": 0, 00:21:22.978 "state": "configuring", 00:21:22.978 "raid_level": "raid1", 00:21:22.978 "superblock": true, 00:21:22.978 "num_base_bdevs": 4, 00:21:22.978 "num_base_bdevs_discovered": 2, 00:21:22.978 "num_base_bdevs_operational": 4, 00:21:22.978 "base_bdevs_list": [ 00:21:22.978 { 00:21:22.978 "name": "BaseBdev1", 00:21:22.978 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:22.978 "is_configured": true, 00:21:22.978 "data_offset": 2048, 00:21:22.978 "data_size": 63488 00:21:22.978 }, 00:21:22.978 { 00:21:22.978 "name": "BaseBdev2", 00:21:22.978 "uuid": 
"354f5b4c-2755-408b-adbd-a6344a707122", 00:21:22.978 "is_configured": true, 00:21:22.978 "data_offset": 2048, 00:21:22.978 "data_size": 63488 00:21:22.978 }, 00:21:22.978 { 00:21:22.978 "name": "BaseBdev3", 00:21:22.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.978 "is_configured": false, 00:21:22.978 "data_offset": 0, 00:21:22.978 "data_size": 0 00:21:22.978 }, 00:21:22.978 { 00:21:22.978 "name": "BaseBdev4", 00:21:22.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.978 "is_configured": false, 00:21:22.978 "data_offset": 0, 00:21:22.978 "data_size": 0 00:21:22.978 } 00:21:22.978 ] 00:21:22.978 }' 00:21:22.978 10:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.978 10:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:23.544 10:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:23.802 [2024-07-12 10:47:58.820616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:23.802 BaseBdev3 00:21:23.802 10:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:23.802 10:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:23.802 10:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:23.802 10:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:23.802 10:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:23.802 10:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:23.802 10:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:24.061 10:47:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:24.320 [ 00:21:24.320 { 00:21:24.320 "name": "BaseBdev3", 00:21:24.320 "aliases": [ 00:21:24.320 "1742e2eb-4c40-4331-8d57-5119a629d312" 00:21:24.320 ], 00:21:24.320 "product_name": "Malloc disk", 00:21:24.320 "block_size": 512, 00:21:24.320 "num_blocks": 65536, 00:21:24.320 "uuid": "1742e2eb-4c40-4331-8d57-5119a629d312", 00:21:24.320 "assigned_rate_limits": { 00:21:24.320 "rw_ios_per_sec": 0, 00:21:24.320 "rw_mbytes_per_sec": 0, 00:21:24.320 "r_mbytes_per_sec": 0, 00:21:24.320 "w_mbytes_per_sec": 0 00:21:24.320 }, 00:21:24.320 "claimed": true, 00:21:24.320 "claim_type": "exclusive_write", 00:21:24.320 "zoned": false, 00:21:24.320 "supported_io_types": { 00:21:24.320 "read": true, 00:21:24.320 "write": true, 00:21:24.320 "unmap": true, 00:21:24.320 "flush": true, 00:21:24.320 "reset": true, 00:21:24.320 "nvme_admin": false, 00:21:24.320 "nvme_io": false, 00:21:24.320 "nvme_io_md": false, 00:21:24.320 "write_zeroes": true, 00:21:24.320 "zcopy": true, 00:21:24.320 "get_zone_info": false, 00:21:24.320 "zone_management": false, 00:21:24.320 "zone_append": false, 00:21:24.320 "compare": false, 00:21:24.320 "compare_and_write": false, 00:21:24.320 "abort": true, 00:21:24.320 "seek_hole": false, 00:21:24.320 
"seek_data": false, 00:21:24.320 "copy": true, 00:21:24.320 "nvme_iov_md": false 00:21:24.320 }, 00:21:24.320 "memory_domains": [ 00:21:24.320 { 00:21:24.320 "dma_device_id": "system", 00:21:24.320 "dma_device_type": 1 00:21:24.320 }, 00:21:24.320 { 00:21:24.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.320 "dma_device_type": 2 00:21:24.320 } 00:21:24.320 ], 00:21:24.320 "driver_specific": {} 00:21:24.320 } 00:21:24.320 ] 00:21:24.320 10:47:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:24.320 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:24.320 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:24.320 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.321 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:24.579 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:24.579 "name": "Existed_Raid", 00:21:24.579 "uuid": "68b7dba0-7fe5-456a-88a9-85b959017246", 00:21:24.579 "strip_size_kb": 0, 00:21:24.579 "state": "configuring", 00:21:24.580 "raid_level": "raid1", 00:21:24.580 "superblock": true, 00:21:24.580 "num_base_bdevs": 4, 00:21:24.580 "num_base_bdevs_discovered": 3, 00:21:24.580 "num_base_bdevs_operational": 4, 00:21:24.580 "base_bdevs_list": [ 00:21:24.580 { 00:21:24.580 "name": "BaseBdev1", 00:21:24.580 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:24.580 "is_configured": true, 00:21:24.580 "data_offset": 2048, 00:21:24.580 "data_size": 63488 00:21:24.580 }, 00:21:24.580 { 00:21:24.580 "name": "BaseBdev2", 00:21:24.580 "uuid": "354f5b4c-2755-408b-adbd-a6344a707122", 00:21:24.580 "is_configured": true, 00:21:24.580 "data_offset": 2048, 00:21:24.580 "data_size": 63488 00:21:24.580 }, 00:21:24.580 { 00:21:24.580 "name": "BaseBdev3", 00:21:24.580 "uuid": "1742e2eb-4c40-4331-8d57-5119a629d312", 00:21:24.580 "is_configured": true, 00:21:24.580 "data_offset": 2048, 00:21:24.580 "data_size": 63488 00:21:24.580 }, 00:21:24.580 { 00:21:24.580 "name": "BaseBdev4", 00:21:24.580 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:24.580 "is_configured": false, 00:21:24.580 "data_offset": 0, 00:21:24.580 "data_size": 0 00:21:24.580 } 00:21:24.580 ] 00:21:24.580 }' 00:21:24.580 10:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:24.580 10:47:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:25.147 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:25.406 [2024-07-12 10:48:00.416294] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:25.406 [2024-07-12 10:48:00.416496] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1723350 00:21:25.406 [2024-07-12 10:48:00.416512] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:25.406 [2024-07-12 10:48:00.416694] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1723020 00:21:25.406 [2024-07-12 10:48:00.416822] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1723350 00:21:25.406 [2024-07-12 10:48:00.416832] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1723350 00:21:25.406 [2024-07-12 10:48:00.416926] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.406 BaseBdev4 00:21:25.406 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:25.406 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:25.406 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:25.406 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:25.406 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:25.406 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:25.406 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:25.664 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:25.922 [ 00:21:25.922 { 00:21:25.922 "name": "BaseBdev4", 00:21:25.922 "aliases": [ 00:21:25.922 "440bf559-fc8d-4031-ba7b-4817ec8c19a5" 00:21:25.922 ], 00:21:25.922 "product_name": "Malloc disk", 00:21:25.922 "block_size": 512, 00:21:25.922 "num_blocks": 65536, 00:21:25.922 "uuid": "440bf559-fc8d-4031-ba7b-4817ec8c19a5", 00:21:25.922 "assigned_rate_limits": { 00:21:25.922 "rw_ios_per_sec": 0, 00:21:25.922 "rw_mbytes_per_sec": 0, 00:21:25.922 "r_mbytes_per_sec": 0, 00:21:25.922 "w_mbytes_per_sec": 0 00:21:25.922 }, 00:21:25.922 "claimed": true, 00:21:25.922 "claim_type": "exclusive_write", 00:21:25.922 "zoned": false, 00:21:25.922 "supported_io_types": { 00:21:25.922 "read": true, 00:21:25.922 "write": true, 00:21:25.922 "unmap": true, 00:21:25.922 "flush": true, 00:21:25.922 "reset": true, 00:21:25.922 "nvme_admin": false, 00:21:25.922 "nvme_io": false, 00:21:25.922 "nvme_io_md": false, 00:21:25.922 
"write_zeroes": true, 00:21:25.922 "zcopy": true, 00:21:25.922 "get_zone_info": false, 00:21:25.922 "zone_management": false, 00:21:25.922 "zone_append": false, 00:21:25.922 "compare": false, 00:21:25.922 "compare_and_write": false, 00:21:25.922 "abort": true, 00:21:25.922 "seek_hole": false, 00:21:25.922 "seek_data": false, 00:21:25.922 "copy": true, 00:21:25.922 "nvme_iov_md": false 00:21:25.922 }, 00:21:25.922 "memory_domains": [ 00:21:25.922 { 00:21:25.922 "dma_device_id": "system", 00:21:25.922 "dma_device_type": 1 00:21:25.922 }, 00:21:25.922 { 00:21:25.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:25.922 "dma_device_type": 2 00:21:25.922 } 00:21:25.922 ], 00:21:25.922 "driver_specific": {} 00:21:25.922 } 00:21:25.922 ] 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.922 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.923 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.923 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.923 10:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:26.181 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.181 "name": "Existed_Raid", 00:21:26.181 "uuid": "68b7dba0-7fe5-456a-88a9-85b959017246", 00:21:26.181 "strip_size_kb": 0, 00:21:26.181 "state": "online", 00:21:26.181 "raid_level": "raid1", 00:21:26.181 "superblock": true, 00:21:26.181 "num_base_bdevs": 4, 00:21:26.181 "num_base_bdevs_discovered": 4, 00:21:26.181 "num_base_bdevs_operational": 4, 00:21:26.181 "base_bdevs_list": [ 00:21:26.181 { 00:21:26.181 "name": "BaseBdev1", 00:21:26.181 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:26.181 "is_configured": true, 00:21:26.181 "data_offset": 2048, 00:21:26.181 "data_size": 63488 00:21:26.181 }, 00:21:26.181 { 00:21:26.181 "name": "BaseBdev2", 00:21:26.181 "uuid": "354f5b4c-2755-408b-adbd-a6344a707122", 00:21:26.181 "is_configured": true, 00:21:26.181 "data_offset": 2048, 00:21:26.181 "data_size": 63488 00:21:26.181 }, 00:21:26.181 { 
00:21:26.181 "name": "BaseBdev3", 00:21:26.181 "uuid": "1742e2eb-4c40-4331-8d57-5119a629d312", 00:21:26.181 "is_configured": true, 00:21:26.181 "data_offset": 2048, 00:21:26.181 "data_size": 63488 00:21:26.181 }, 00:21:26.181 { 00:21:26.181 "name": "BaseBdev4", 00:21:26.181 "uuid": "440bf559-fc8d-4031-ba7b-4817ec8c19a5", 00:21:26.181 "is_configured": true, 00:21:26.181 "data_offset": 2048, 00:21:26.181 "data_size": 63488 00:21:26.181 } 00:21:26.181 ] 00:21:26.181 }' 00:21:26.181 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.181 10:48:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:26.747 10:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:27.006 [2024-07-12 10:48:01.988817] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:27.006 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:27.006 "name": "Existed_Raid", 00:21:27.006 "aliases": [ 00:21:27.006 "68b7dba0-7fe5-456a-88a9-85b959017246" 00:21:27.006 ], 00:21:27.006 "product_name": "Raid Volume", 00:21:27.006 "block_size": 512, 00:21:27.006 "num_blocks": 63488, 00:21:27.006 "uuid": "68b7dba0-7fe5-456a-88a9-85b959017246", 00:21:27.006 "assigned_rate_limits": { 00:21:27.006 "rw_ios_per_sec": 0, 00:21:27.006 "rw_mbytes_per_sec": 0, 00:21:27.006 "r_mbytes_per_sec": 0, 00:21:27.006 "w_mbytes_per_sec": 0 00:21:27.006 }, 00:21:27.006 "claimed": false, 00:21:27.006 "zoned": false, 00:21:27.006 "supported_io_types": { 00:21:27.006 "read": true, 00:21:27.006 "write": true, 00:21:27.006 "unmap": false, 00:21:27.006 "flush": false, 00:21:27.006 "reset": true, 00:21:27.006 "nvme_admin": false, 00:21:27.006 "nvme_io": false, 00:21:27.006 "nvme_io_md": false, 00:21:27.006 "write_zeroes": true, 00:21:27.006 "zcopy": false, 00:21:27.006 "get_zone_info": false, 00:21:27.006 "zone_management": false, 00:21:27.006 "zone_append": false, 00:21:27.006 "compare": false, 00:21:27.006 "compare_and_write": false, 00:21:27.006 "abort": false, 00:21:27.006 "seek_hole": false, 00:21:27.006 "seek_data": false, 00:21:27.006 "copy": false, 00:21:27.006 "nvme_iov_md": false 00:21:27.006 }, 00:21:27.006 "memory_domains": [ 00:21:27.006 { 00:21:27.006 "dma_device_id": "system", 00:21:27.006 "dma_device_type": 1 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.006 "dma_device_type": 2 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "dma_device_id": "system", 00:21:27.006 "dma_device_type": 1 00:21:27.006 }, 00:21:27.006 { 
00:21:27.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.006 "dma_device_type": 2 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "dma_device_id": "system", 00:21:27.006 "dma_device_type": 1 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.006 "dma_device_type": 2 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "dma_device_id": "system", 00:21:27.006 "dma_device_type": 1 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.006 "dma_device_type": 2 00:21:27.006 } 00:21:27.006 ], 00:21:27.006 "driver_specific": { 00:21:27.006 "raid": { 00:21:27.006 "uuid": "68b7dba0-7fe5-456a-88a9-85b959017246", 00:21:27.006 "strip_size_kb": 0, 00:21:27.006 "state": "online", 00:21:27.006 "raid_level": "raid1", 00:21:27.006 "superblock": true, 00:21:27.006 "num_base_bdevs": 4, 00:21:27.006 "num_base_bdevs_discovered": 4, 00:21:27.006 "num_base_bdevs_operational": 4, 00:21:27.006 "base_bdevs_list": [ 00:21:27.006 { 00:21:27.006 "name": "BaseBdev1", 00:21:27.006 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:27.006 "is_configured": true, 00:21:27.006 "data_offset": 2048, 00:21:27.006 "data_size": 63488 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "name": "BaseBdev2", 00:21:27.006 "uuid": "354f5b4c-2755-408b-adbd-a6344a707122", 00:21:27.006 "is_configured": true, 00:21:27.006 "data_offset": 2048, 00:21:27.006 "data_size": 63488 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "name": "BaseBdev3", 00:21:27.006 "uuid": "1742e2eb-4c40-4331-8d57-5119a629d312", 00:21:27.006 "is_configured": true, 00:21:27.006 "data_offset": 2048, 00:21:27.006 "data_size": 63488 00:21:27.006 }, 00:21:27.006 { 00:21:27.006 "name": "BaseBdev4", 00:21:27.006 "uuid": "440bf559-fc8d-4031-ba7b-4817ec8c19a5", 00:21:27.006 "is_configured": true, 00:21:27.006 "data_offset": 2048, 00:21:27.006 "data_size": 63488 00:21:27.006 } 00:21:27.006 ] 00:21:27.006 } 00:21:27.006 } 00:21:27.006 }' 00:21:27.006 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:27.006 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:27.006 BaseBdev2 00:21:27.006 BaseBdev3 00:21:27.006 BaseBdev4' 00:21:27.006 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:27.006 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:27.006 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.318 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.318 "name": "BaseBdev1", 00:21:27.318 "aliases": [ 00:21:27.318 "29b3bdfa-e337-405b-9ec5-db7543d608e3" 00:21:27.318 ], 00:21:27.318 "product_name": "Malloc disk", 00:21:27.318 "block_size": 512, 00:21:27.318 "num_blocks": 65536, 00:21:27.318 "uuid": "29b3bdfa-e337-405b-9ec5-db7543d608e3", 00:21:27.318 "assigned_rate_limits": { 00:21:27.318 "rw_ios_per_sec": 0, 00:21:27.318 "rw_mbytes_per_sec": 0, 00:21:27.318 "r_mbytes_per_sec": 0, 00:21:27.318 "w_mbytes_per_sec": 0 00:21:27.318 }, 00:21:27.318 "claimed": true, 00:21:27.318 "claim_type": "exclusive_write", 00:21:27.318 "zoned": false, 00:21:27.318 "supported_io_types": { 00:21:27.318 "read": true, 00:21:27.318 "write": true, 
00:21:27.318 "unmap": true, 00:21:27.318 "flush": true, 00:21:27.318 "reset": true, 00:21:27.318 "nvme_admin": false, 00:21:27.318 "nvme_io": false, 00:21:27.318 "nvme_io_md": false, 00:21:27.318 "write_zeroes": true, 00:21:27.318 "zcopy": true, 00:21:27.318 "get_zone_info": false, 00:21:27.318 "zone_management": false, 00:21:27.318 "zone_append": false, 00:21:27.318 "compare": false, 00:21:27.319 "compare_and_write": false, 00:21:27.319 "abort": true, 00:21:27.319 "seek_hole": false, 00:21:27.319 "seek_data": false, 00:21:27.319 "copy": true, 00:21:27.319 "nvme_iov_md": false 00:21:27.319 }, 00:21:27.319 "memory_domains": [ 00:21:27.319 { 00:21:27.319 "dma_device_id": "system", 00:21:27.319 "dma_device_type": 1 00:21:27.319 }, 00:21:27.319 { 00:21:27.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.319 "dma_device_type": 2 00:21:27.319 } 00:21:27.319 ], 00:21:27.319 "driver_specific": {} 00:21:27.319 }' 00:21:27.319 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.319 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.319 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.319 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.319 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.319 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:27.319 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:27.577 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.838 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.838 "name": "BaseBdev2", 00:21:27.838 "aliases": [ 00:21:27.838 "354f5b4c-2755-408b-adbd-a6344a707122" 00:21:27.838 ], 00:21:27.838 "product_name": "Malloc disk", 00:21:27.838 "block_size": 512, 00:21:27.838 "num_blocks": 65536, 00:21:27.838 "uuid": "354f5b4c-2755-408b-adbd-a6344a707122", 00:21:27.838 "assigned_rate_limits": { 00:21:27.838 "rw_ios_per_sec": 0, 00:21:27.838 "rw_mbytes_per_sec": 0, 00:21:27.838 "r_mbytes_per_sec": 0, 00:21:27.838 "w_mbytes_per_sec": 0 00:21:27.838 }, 00:21:27.838 "claimed": true, 00:21:27.838 "claim_type": "exclusive_write", 00:21:27.838 "zoned": false, 00:21:27.838 "supported_io_types": { 00:21:27.838 "read": true, 00:21:27.838 "write": true, 00:21:27.838 "unmap": true, 00:21:27.838 "flush": true, 00:21:27.838 "reset": true, 00:21:27.838 "nvme_admin": false, 00:21:27.838 
"nvme_io": false, 00:21:27.838 "nvme_io_md": false, 00:21:27.838 "write_zeroes": true, 00:21:27.838 "zcopy": true, 00:21:27.838 "get_zone_info": false, 00:21:27.838 "zone_management": false, 00:21:27.838 "zone_append": false, 00:21:27.838 "compare": false, 00:21:27.838 "compare_and_write": false, 00:21:27.838 "abort": true, 00:21:27.838 "seek_hole": false, 00:21:27.838 "seek_data": false, 00:21:27.838 "copy": true, 00:21:27.838 "nvme_iov_md": false 00:21:27.838 }, 00:21:27.838 "memory_domains": [ 00:21:27.838 { 00:21:27.838 "dma_device_id": "system", 00:21:27.838 "dma_device_type": 1 00:21:27.838 }, 00:21:27.838 { 00:21:27.838 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.838 "dma_device_type": 2 00:21:27.838 } 00:21:27.838 ], 00:21:27.838 "driver_specific": {} 00:21:27.838 }' 00:21:27.838 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.838 10:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.838 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.838 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:28.095 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:28.353 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:28.353 "name": "BaseBdev3", 00:21:28.353 "aliases": [ 00:21:28.353 "1742e2eb-4c40-4331-8d57-5119a629d312" 00:21:28.353 ], 00:21:28.353 "product_name": "Malloc disk", 00:21:28.353 "block_size": 512, 00:21:28.353 "num_blocks": 65536, 00:21:28.353 "uuid": "1742e2eb-4c40-4331-8d57-5119a629d312", 00:21:28.353 "assigned_rate_limits": { 00:21:28.353 "rw_ios_per_sec": 0, 00:21:28.353 "rw_mbytes_per_sec": 0, 00:21:28.353 "r_mbytes_per_sec": 0, 00:21:28.353 "w_mbytes_per_sec": 0 00:21:28.353 }, 00:21:28.353 "claimed": true, 00:21:28.353 "claim_type": "exclusive_write", 00:21:28.353 "zoned": false, 00:21:28.353 "supported_io_types": { 00:21:28.353 "read": true, 00:21:28.353 "write": true, 00:21:28.353 "unmap": true, 00:21:28.353 "flush": true, 00:21:28.353 "reset": true, 00:21:28.353 "nvme_admin": false, 00:21:28.353 "nvme_io": false, 00:21:28.353 "nvme_io_md": false, 00:21:28.353 "write_zeroes": true, 00:21:28.353 "zcopy": true, 00:21:28.353 
"get_zone_info": false, 00:21:28.353 "zone_management": false, 00:21:28.353 "zone_append": false, 00:21:28.353 "compare": false, 00:21:28.353 "compare_and_write": false, 00:21:28.353 "abort": true, 00:21:28.353 "seek_hole": false, 00:21:28.353 "seek_data": false, 00:21:28.353 "copy": true, 00:21:28.353 "nvme_iov_md": false 00:21:28.353 }, 00:21:28.353 "memory_domains": [ 00:21:28.353 { 00:21:28.353 "dma_device_id": "system", 00:21:28.353 "dma_device_type": 1 00:21:28.353 }, 00:21:28.353 { 00:21:28.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:28.353 "dma_device_type": 2 00:21:28.353 } 00:21:28.353 ], 00:21:28.353 "driver_specific": {} 00:21:28.353 }' 00:21:28.353 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.353 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.612 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.870 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:28.870 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:28.870 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:28.870 10:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:29.128 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:29.128 "name": "BaseBdev4", 00:21:29.128 "aliases": [ 00:21:29.128 "440bf559-fc8d-4031-ba7b-4817ec8c19a5" 00:21:29.128 ], 00:21:29.128 "product_name": "Malloc disk", 00:21:29.128 "block_size": 512, 00:21:29.128 "num_blocks": 65536, 00:21:29.128 "uuid": "440bf559-fc8d-4031-ba7b-4817ec8c19a5", 00:21:29.128 "assigned_rate_limits": { 00:21:29.128 "rw_ios_per_sec": 0, 00:21:29.128 "rw_mbytes_per_sec": 0, 00:21:29.128 "r_mbytes_per_sec": 0, 00:21:29.128 "w_mbytes_per_sec": 0 00:21:29.128 }, 00:21:29.128 "claimed": true, 00:21:29.128 "claim_type": "exclusive_write", 00:21:29.128 "zoned": false, 00:21:29.128 "supported_io_types": { 00:21:29.128 "read": true, 00:21:29.128 "write": true, 00:21:29.128 "unmap": true, 00:21:29.128 "flush": true, 00:21:29.128 "reset": true, 00:21:29.128 "nvme_admin": false, 00:21:29.128 "nvme_io": false, 00:21:29.128 "nvme_io_md": false, 00:21:29.128 "write_zeroes": true, 00:21:29.128 "zcopy": true, 00:21:29.128 "get_zone_info": false, 00:21:29.128 "zone_management": false, 00:21:29.128 "zone_append": false, 00:21:29.128 "compare": false, 
00:21:29.128 "compare_and_write": false, 00:21:29.128 "abort": true, 00:21:29.128 "seek_hole": false, 00:21:29.128 "seek_data": false, 00:21:29.128 "copy": true, 00:21:29.128 "nvme_iov_md": false 00:21:29.128 }, 00:21:29.128 "memory_domains": [ 00:21:29.128 { 00:21:29.128 "dma_device_id": "system", 00:21:29.128 "dma_device_type": 1 00:21:29.128 }, 00:21:29.128 { 00:21:29.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.128 "dma_device_type": 2 00:21:29.128 } 00:21:29.128 ], 00:21:29.128 "driver_specific": {} 00:21:29.128 }' 00:21:29.128 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:29.128 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:29.128 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:29.128 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:29.128 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:29.129 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:29.129 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:29.129 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:29.387 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:29.387 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:29.387 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:29.387 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:29.387 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:29.646 [2024-07-12 10:48:04.651597] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.646 10:48:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.646 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:29.905 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.905 "name": "Existed_Raid", 00:21:29.905 "uuid": "68b7dba0-7fe5-456a-88a9-85b959017246", 00:21:29.905 "strip_size_kb": 0, 00:21:29.905 "state": "online", 00:21:29.905 "raid_level": "raid1", 00:21:29.905 "superblock": true, 00:21:29.905 "num_base_bdevs": 4, 00:21:29.905 "num_base_bdevs_discovered": 3, 00:21:29.905 "num_base_bdevs_operational": 3, 00:21:29.905 "base_bdevs_list": [ 00:21:29.905 { 00:21:29.905 "name": null, 00:21:29.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.905 "is_configured": false, 00:21:29.905 "data_offset": 2048, 00:21:29.905 "data_size": 63488 00:21:29.905 }, 00:21:29.905 { 00:21:29.905 "name": "BaseBdev2", 00:21:29.905 "uuid": "354f5b4c-2755-408b-adbd-a6344a707122", 00:21:29.905 "is_configured": true, 00:21:29.905 "data_offset": 2048, 00:21:29.905 "data_size": 63488 00:21:29.905 }, 00:21:29.905 { 00:21:29.905 "name": "BaseBdev3", 00:21:29.905 "uuid": "1742e2eb-4c40-4331-8d57-5119a629d312", 00:21:29.905 "is_configured": true, 00:21:29.905 "data_offset": 2048, 00:21:29.905 "data_size": 63488 00:21:29.905 }, 00:21:29.905 { 00:21:29.905 "name": "BaseBdev4", 00:21:29.905 "uuid": "440bf559-fc8d-4031-ba7b-4817ec8c19a5", 00:21:29.905 "is_configured": true, 00:21:29.905 "data_offset": 2048, 00:21:29.905 "data_size": 63488 00:21:29.905 } 00:21:29.905 ] 00:21:29.905 }' 00:21:29.905 10:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.905 10:48:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:30.472 10:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:30.472 10:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:30.472 10:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.472 10:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:30.749 10:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:30.749 10:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:30.749 10:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:31.033 [2024-07-12 10:48:05.993130] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:31.033 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:31.033 10:48:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:31.033 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.033 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:31.292 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:31.292 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:31.292 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:31.551 [2024-07-12 10:48:06.498981] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:31.551 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:31.551 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:31.551 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.551 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:31.810 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:31.810 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:31.810 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:31.810 [2024-07-12 10:48:06.926884] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:31.810 [2024-07-12 10:48:06.926973] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:31.810 [2024-07-12 10:48:06.937821] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:31.810 [2024-07-12 10:48:06.937856] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:31.810 [2024-07-12 10:48:06.937868] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1723350 name Existed_Raid, state offline 00:21:31.810 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:31.810 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:31.810 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.810 10:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:32.069 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:32.069 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:32.069 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:32.069 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:32.069 10:48:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:32.069 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:32.328 BaseBdev2 00:21:32.328 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:32.328 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:32.328 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:32.328 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:32.328 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:32.329 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:32.329 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:32.588 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:32.847 [ 00:21:32.847 { 00:21:32.847 "name": "BaseBdev2", 00:21:32.847 "aliases": [ 00:21:32.847 "e4db50c2-d284-4907-bc0e-4821075866b6" 00:21:32.847 ], 00:21:32.847 "product_name": "Malloc disk", 00:21:32.847 "block_size": 512, 00:21:32.847 "num_blocks": 65536, 00:21:32.847 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:32.847 "assigned_rate_limits": { 00:21:32.847 "rw_ios_per_sec": 0, 00:21:32.847 "rw_mbytes_per_sec": 0, 00:21:32.847 "r_mbytes_per_sec": 0, 00:21:32.847 "w_mbytes_per_sec": 0 00:21:32.847 }, 00:21:32.847 "claimed": false, 00:21:32.847 "zoned": false, 00:21:32.847 "supported_io_types": { 00:21:32.847 "read": true, 00:21:32.847 "write": true, 00:21:32.847 "unmap": true, 00:21:32.847 "flush": true, 00:21:32.847 "reset": true, 00:21:32.847 "nvme_admin": false, 00:21:32.847 "nvme_io": false, 00:21:32.847 "nvme_io_md": false, 00:21:32.847 "write_zeroes": true, 00:21:32.847 "zcopy": true, 00:21:32.847 "get_zone_info": false, 00:21:32.847 "zone_management": false, 00:21:32.847 "zone_append": false, 00:21:32.847 "compare": false, 00:21:32.847 "compare_and_write": false, 00:21:32.847 "abort": true, 00:21:32.848 "seek_hole": false, 00:21:32.848 "seek_data": false, 00:21:32.848 "copy": true, 00:21:32.848 "nvme_iov_md": false 00:21:32.848 }, 00:21:32.848 "memory_domains": [ 00:21:32.848 { 00:21:32.848 "dma_device_id": "system", 00:21:32.848 "dma_device_type": 1 00:21:32.848 }, 00:21:32.848 { 00:21:32.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.848 "dma_device_type": 2 00:21:32.848 } 00:21:32.848 ], 00:21:32.848 "driver_specific": {} 00:21:32.848 } 00:21:32.848 ] 00:21:32.848 10:48:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:32.848 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:32.848 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:32.848 10:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:33.107 BaseBdev3 00:21:33.107 10:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:33.107 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:33.107 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:33.107 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:33.107 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:33.107 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:33.107 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:33.366 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:33.625 [ 00:21:33.625 { 00:21:33.625 "name": "BaseBdev3", 00:21:33.625 "aliases": [ 00:21:33.625 "8f069435-076a-4b3e-80ce-169e1688b8cc" 00:21:33.625 ], 00:21:33.625 "product_name": "Malloc disk", 00:21:33.625 "block_size": 512, 00:21:33.625 "num_blocks": 65536, 00:21:33.625 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:33.625 "assigned_rate_limits": { 00:21:33.625 "rw_ios_per_sec": 0, 00:21:33.625 "rw_mbytes_per_sec": 0, 00:21:33.625 "r_mbytes_per_sec": 0, 00:21:33.625 "w_mbytes_per_sec": 0 00:21:33.625 }, 00:21:33.625 "claimed": false, 00:21:33.625 "zoned": false, 00:21:33.625 "supported_io_types": { 00:21:33.625 "read": true, 00:21:33.625 "write": true, 00:21:33.625 "unmap": true, 00:21:33.625 "flush": true, 00:21:33.625 "reset": true, 00:21:33.625 "nvme_admin": false, 00:21:33.625 "nvme_io": false, 00:21:33.625 "nvme_io_md": false, 00:21:33.625 "write_zeroes": true, 00:21:33.625 "zcopy": true, 00:21:33.625 "get_zone_info": false, 00:21:33.625 "zone_management": false, 00:21:33.625 "zone_append": false, 00:21:33.625 "compare": false, 00:21:33.625 "compare_and_write": false, 00:21:33.625 "abort": true, 00:21:33.625 "seek_hole": false, 00:21:33.625 "seek_data": false, 00:21:33.625 "copy": true, 00:21:33.625 "nvme_iov_md": false 00:21:33.625 }, 00:21:33.625 "memory_domains": [ 00:21:33.625 { 00:21:33.625 "dma_device_id": "system", 00:21:33.625 "dma_device_type": 1 00:21:33.625 }, 00:21:33.625 { 00:21:33.625 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.625 "dma_device_type": 2 00:21:33.625 } 00:21:33.625 ], 00:21:33.625 "driver_specific": {} 00:21:33.625 } 00:21:33.625 ] 00:21:33.625 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:33.625 10:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:33.625 10:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:33.625 10:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:33.884 BaseBdev4 00:21:33.884 10:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:33.884 10:48:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:33.884 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:33.884 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:33.884 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:33.884 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:33.884 10:48:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:34.144 10:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:34.144 [ 00:21:34.144 { 00:21:34.144 "name": "BaseBdev4", 00:21:34.144 "aliases": [ 00:21:34.144 "01439852-ee0e-467f-b7fb-a779c57ce53e" 00:21:34.144 ], 00:21:34.144 "product_name": "Malloc disk", 00:21:34.144 "block_size": 512, 00:21:34.144 "num_blocks": 65536, 00:21:34.144 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:34.144 "assigned_rate_limits": { 00:21:34.144 "rw_ios_per_sec": 0, 00:21:34.144 "rw_mbytes_per_sec": 0, 00:21:34.144 "r_mbytes_per_sec": 0, 00:21:34.144 "w_mbytes_per_sec": 0 00:21:34.144 }, 00:21:34.144 "claimed": false, 00:21:34.144 "zoned": false, 00:21:34.144 "supported_io_types": { 00:21:34.144 "read": true, 00:21:34.144 "write": true, 00:21:34.144 "unmap": true, 00:21:34.144 "flush": true, 00:21:34.144 "reset": true, 00:21:34.144 "nvme_admin": false, 00:21:34.144 "nvme_io": false, 00:21:34.144 "nvme_io_md": false, 00:21:34.144 "write_zeroes": true, 00:21:34.144 "zcopy": true, 00:21:34.144 "get_zone_info": false, 00:21:34.144 "zone_management": false, 00:21:34.144 "zone_append": false, 00:21:34.144 "compare": false, 00:21:34.144 "compare_and_write": false, 00:21:34.144 "abort": true, 00:21:34.144 "seek_hole": false, 00:21:34.144 "seek_data": false, 00:21:34.144 "copy": true, 00:21:34.144 "nvme_iov_md": false 00:21:34.144 }, 00:21:34.144 "memory_domains": [ 00:21:34.144 { 00:21:34.144 "dma_device_id": "system", 00:21:34.144 "dma_device_type": 1 00:21:34.144 }, 00:21:34.144 { 00:21:34.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.144 "dma_device_type": 2 00:21:34.144 } 00:21:34.144 ], 00:21:34.144 "driver_specific": {} 00:21:34.144 } 00:21:34.144 ] 00:21:34.144 10:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:34.144 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:34.144 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:34.144 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:34.404 [2024-07-12 10:48:09.456206] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:34.404 [2024-07-12 10:48:09.456245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:34.404 [2024-07-12 10:48:09.456263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:21:34.404 [2024-07-12 10:48:09.457571] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:34.404 [2024-07-12 10:48:09.457612] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.404 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:34.663 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:34.663 "name": "Existed_Raid", 00:21:34.663 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:34.663 "strip_size_kb": 0, 00:21:34.663 "state": "configuring", 00:21:34.663 "raid_level": "raid1", 00:21:34.663 "superblock": true, 00:21:34.663 "num_base_bdevs": 4, 00:21:34.663 "num_base_bdevs_discovered": 3, 00:21:34.663 "num_base_bdevs_operational": 4, 00:21:34.663 "base_bdevs_list": [ 00:21:34.663 { 00:21:34.663 "name": "BaseBdev1", 00:21:34.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:34.663 "is_configured": false, 00:21:34.663 "data_offset": 0, 00:21:34.663 "data_size": 0 00:21:34.663 }, 00:21:34.663 { 00:21:34.663 "name": "BaseBdev2", 00:21:34.663 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:34.663 "is_configured": true, 00:21:34.663 "data_offset": 2048, 00:21:34.663 "data_size": 63488 00:21:34.663 }, 00:21:34.663 { 00:21:34.663 "name": "BaseBdev3", 00:21:34.663 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:34.663 "is_configured": true, 00:21:34.663 "data_offset": 2048, 00:21:34.663 "data_size": 63488 00:21:34.663 }, 00:21:34.663 { 00:21:34.663 "name": "BaseBdev4", 00:21:34.663 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:34.663 "is_configured": true, 00:21:34.663 "data_offset": 2048, 00:21:34.663 "data_size": 63488 00:21:34.663 } 00:21:34.663 ] 00:21:34.663 }' 00:21:34.663 10:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:34.663 10:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:35.232 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:35.490 [2024-07-12 10:48:10.567193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.490 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:35.750 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.750 "name": "Existed_Raid", 00:21:35.750 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:35.750 "strip_size_kb": 0, 00:21:35.750 "state": "configuring", 00:21:35.750 "raid_level": "raid1", 00:21:35.750 "superblock": true, 00:21:35.750 "num_base_bdevs": 4, 00:21:35.750 "num_base_bdevs_discovered": 2, 00:21:35.750 "num_base_bdevs_operational": 4, 00:21:35.750 "base_bdevs_list": [ 00:21:35.750 { 00:21:35.750 "name": "BaseBdev1", 00:21:35.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.750 "is_configured": false, 00:21:35.750 "data_offset": 0, 00:21:35.750 "data_size": 0 00:21:35.750 }, 00:21:35.750 { 00:21:35.750 "name": null, 00:21:35.750 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:35.750 "is_configured": false, 00:21:35.750 "data_offset": 2048, 00:21:35.750 "data_size": 63488 00:21:35.750 }, 00:21:35.750 { 00:21:35.750 "name": "BaseBdev3", 00:21:35.750 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:35.750 "is_configured": true, 00:21:35.750 "data_offset": 2048, 00:21:35.750 "data_size": 63488 00:21:35.750 }, 00:21:35.750 { 00:21:35.750 "name": "BaseBdev4", 00:21:35.750 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:35.750 "is_configured": true, 00:21:35.750 "data_offset": 2048, 00:21:35.750 "data_size": 63488 00:21:35.750 } 00:21:35.750 ] 00:21:35.750 }' 00:21:35.750 10:48:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.750 10:48:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:36.318 10:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.318 10:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:36.577 10:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:36.577 10:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:36.836 [2024-07-12 10:48:11.915087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:36.836 BaseBdev1 00:21:36.836 10:48:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:36.836 10:48:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:36.836 10:48:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:36.836 10:48:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:36.836 10:48:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:36.836 10:48:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:36.836 10:48:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:37.095 10:48:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:37.354 [ 00:21:37.354 { 00:21:37.354 "name": "BaseBdev1", 00:21:37.354 "aliases": [ 00:21:37.354 "daf97543-16b7-4649-9352-20df7da034e4" 00:21:37.354 ], 00:21:37.354 "product_name": "Malloc disk", 00:21:37.354 "block_size": 512, 00:21:37.354 "num_blocks": 65536, 00:21:37.354 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:37.354 "assigned_rate_limits": { 00:21:37.354 "rw_ios_per_sec": 0, 00:21:37.354 "rw_mbytes_per_sec": 0, 00:21:37.354 "r_mbytes_per_sec": 0, 00:21:37.354 "w_mbytes_per_sec": 0 00:21:37.354 }, 00:21:37.354 "claimed": true, 00:21:37.354 "claim_type": "exclusive_write", 00:21:37.354 "zoned": false, 00:21:37.354 "supported_io_types": { 00:21:37.354 "read": true, 00:21:37.354 "write": true, 00:21:37.354 "unmap": true, 00:21:37.354 "flush": true, 00:21:37.354 "reset": true, 00:21:37.354 "nvme_admin": false, 00:21:37.354 "nvme_io": false, 00:21:37.354 "nvme_io_md": false, 00:21:37.354 "write_zeroes": true, 00:21:37.354 "zcopy": true, 00:21:37.354 "get_zone_info": false, 00:21:37.354 "zone_management": false, 00:21:37.354 "zone_append": false, 00:21:37.354 "compare": false, 00:21:37.354 "compare_and_write": false, 00:21:37.354 "abort": true, 00:21:37.354 "seek_hole": false, 00:21:37.354 "seek_data": false, 00:21:37.354 "copy": true, 00:21:37.354 "nvme_iov_md": false 00:21:37.354 }, 00:21:37.354 "memory_domains": [ 00:21:37.354 { 00:21:37.354 "dma_device_id": "system", 00:21:37.354 "dma_device_type": 1 00:21:37.354 }, 00:21:37.354 { 00:21:37.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.354 "dma_device_type": 2 00:21:37.354 } 00:21:37.354 ], 00:21:37.354 "driver_specific": {} 00:21:37.354 } 00:21:37.354 ] 00:21:37.354 10:48:12 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.354 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:37.613 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.613 "name": "Existed_Raid", 00:21:37.613 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:37.613 "strip_size_kb": 0, 00:21:37.613 "state": "configuring", 00:21:37.613 "raid_level": "raid1", 00:21:37.613 "superblock": true, 00:21:37.613 "num_base_bdevs": 4, 00:21:37.613 "num_base_bdevs_discovered": 3, 00:21:37.613 "num_base_bdevs_operational": 4, 00:21:37.613 "base_bdevs_list": [ 00:21:37.613 { 00:21:37.613 "name": "BaseBdev1", 00:21:37.613 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:37.613 "is_configured": true, 00:21:37.613 "data_offset": 2048, 00:21:37.613 "data_size": 63488 00:21:37.613 }, 00:21:37.613 { 00:21:37.613 "name": null, 00:21:37.613 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:37.613 "is_configured": false, 00:21:37.613 "data_offset": 2048, 00:21:37.613 "data_size": 63488 00:21:37.613 }, 00:21:37.613 { 00:21:37.613 "name": "BaseBdev3", 00:21:37.613 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:37.613 "is_configured": true, 00:21:37.613 "data_offset": 2048, 00:21:37.613 "data_size": 63488 00:21:37.613 }, 00:21:37.613 { 00:21:37.613 "name": "BaseBdev4", 00:21:37.613 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:37.613 "is_configured": true, 00:21:37.613 "data_offset": 2048, 00:21:37.613 "data_size": 63488 00:21:37.613 } 00:21:37.613 ] 00:21:37.613 }' 00:21:37.613 10:48:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.613 10:48:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:38.181 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.181 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
jq '.[0].base_bdevs_list[0].is_configured' 00:21:38.439 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:38.439 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:38.697 [2024-07-12 10:48:13.759994] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.697 10:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:38.954 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.954 "name": "Existed_Raid", 00:21:38.954 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:38.954 "strip_size_kb": 0, 00:21:38.954 "state": "configuring", 00:21:38.954 "raid_level": "raid1", 00:21:38.954 "superblock": true, 00:21:38.954 "num_base_bdevs": 4, 00:21:38.954 "num_base_bdevs_discovered": 2, 00:21:38.954 "num_base_bdevs_operational": 4, 00:21:38.954 "base_bdevs_list": [ 00:21:38.954 { 00:21:38.954 "name": "BaseBdev1", 00:21:38.954 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:38.954 "is_configured": true, 00:21:38.954 "data_offset": 2048, 00:21:38.954 "data_size": 63488 00:21:38.954 }, 00:21:38.954 { 00:21:38.954 "name": null, 00:21:38.954 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:38.954 "is_configured": false, 00:21:38.954 "data_offset": 2048, 00:21:38.954 "data_size": 63488 00:21:38.954 }, 00:21:38.954 { 00:21:38.954 "name": null, 00:21:38.954 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:38.954 "is_configured": false, 00:21:38.954 "data_offset": 2048, 00:21:38.954 "data_size": 63488 00:21:38.954 }, 00:21:38.954 { 00:21:38.954 "name": "BaseBdev4", 00:21:38.954 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:38.954 "is_configured": true, 00:21:38.954 "data_offset": 2048, 00:21:38.954 "data_size": 63488 00:21:38.954 } 00:21:38.954 ] 00:21:38.954 }' 00:21:38.954 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:21:38.954 10:48:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:39.519 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.519 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:39.779 [2024-07-12 10:48:14.943147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.779 10:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:40.038 10:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.038 "name": "Existed_Raid", 00:21:40.038 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:40.038 "strip_size_kb": 0, 00:21:40.038 "state": "configuring", 00:21:40.038 "raid_level": "raid1", 00:21:40.038 "superblock": true, 00:21:40.038 "num_base_bdevs": 4, 00:21:40.038 "num_base_bdevs_discovered": 3, 00:21:40.038 "num_base_bdevs_operational": 4, 00:21:40.038 "base_bdevs_list": [ 00:21:40.038 { 00:21:40.038 "name": "BaseBdev1", 00:21:40.038 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:40.038 "is_configured": true, 00:21:40.038 "data_offset": 2048, 00:21:40.038 "data_size": 63488 00:21:40.039 }, 00:21:40.039 { 00:21:40.039 "name": null, 00:21:40.039 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:40.039 "is_configured": false, 00:21:40.039 "data_offset": 2048, 00:21:40.039 "data_size": 63488 00:21:40.039 }, 00:21:40.039 { 00:21:40.039 "name": "BaseBdev3", 00:21:40.039 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:40.039 "is_configured": true, 00:21:40.039 
"data_offset": 2048, 00:21:40.039 "data_size": 63488 00:21:40.039 }, 00:21:40.039 { 00:21:40.039 "name": "BaseBdev4", 00:21:40.039 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:40.039 "is_configured": true, 00:21:40.039 "data_offset": 2048, 00:21:40.039 "data_size": 63488 00:21:40.039 } 00:21:40.039 ] 00:21:40.039 }' 00:21:40.039 10:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.039 10:48:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:40.607 10:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.607 10:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:40.866 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:40.866 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:41.126 [2024-07-12 10:48:16.254641] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.126 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:41.386 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.386 "name": "Existed_Raid", 00:21:41.386 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:41.386 "strip_size_kb": 0, 00:21:41.386 "state": "configuring", 00:21:41.386 "raid_level": "raid1", 00:21:41.386 "superblock": true, 00:21:41.386 "num_base_bdevs": 4, 00:21:41.386 "num_base_bdevs_discovered": 2, 00:21:41.386 "num_base_bdevs_operational": 4, 00:21:41.386 "base_bdevs_list": [ 00:21:41.386 { 00:21:41.386 "name": null, 00:21:41.386 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:41.386 "is_configured": false, 00:21:41.386 "data_offset": 2048, 00:21:41.386 "data_size": 63488 00:21:41.386 }, 
00:21:41.386 { 00:21:41.386 "name": null, 00:21:41.386 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:41.386 "is_configured": false, 00:21:41.386 "data_offset": 2048, 00:21:41.386 "data_size": 63488 00:21:41.386 }, 00:21:41.386 { 00:21:41.386 "name": "BaseBdev3", 00:21:41.386 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:41.386 "is_configured": true, 00:21:41.386 "data_offset": 2048, 00:21:41.386 "data_size": 63488 00:21:41.386 }, 00:21:41.386 { 00:21:41.386 "name": "BaseBdev4", 00:21:41.386 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:41.386 "is_configured": true, 00:21:41.386 "data_offset": 2048, 00:21:41.386 "data_size": 63488 00:21:41.386 } 00:21:41.386 ] 00:21:41.386 }' 00:21:41.386 10:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.386 10:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:41.954 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.954 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:42.212 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:42.212 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:42.472 [2024-07-12 10:48:17.613191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.472 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:42.731 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.731 "name": "Existed_Raid", 00:21:42.731 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:42.731 "strip_size_kb": 0, 00:21:42.731 "state": "configuring", 00:21:42.731 "raid_level": 
"raid1", 00:21:42.731 "superblock": true, 00:21:42.731 "num_base_bdevs": 4, 00:21:42.731 "num_base_bdevs_discovered": 3, 00:21:42.731 "num_base_bdevs_operational": 4, 00:21:42.731 "base_bdevs_list": [ 00:21:42.731 { 00:21:42.731 "name": null, 00:21:42.731 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:42.731 "is_configured": false, 00:21:42.731 "data_offset": 2048, 00:21:42.731 "data_size": 63488 00:21:42.731 }, 00:21:42.731 { 00:21:42.731 "name": "BaseBdev2", 00:21:42.731 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:42.731 "is_configured": true, 00:21:42.731 "data_offset": 2048, 00:21:42.731 "data_size": 63488 00:21:42.731 }, 00:21:42.731 { 00:21:42.731 "name": "BaseBdev3", 00:21:42.731 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:42.731 "is_configured": true, 00:21:42.731 "data_offset": 2048, 00:21:42.731 "data_size": 63488 00:21:42.731 }, 00:21:42.731 { 00:21:42.731 "name": "BaseBdev4", 00:21:42.731 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:42.731 "is_configured": true, 00:21:42.731 "data_offset": 2048, 00:21:42.731 "data_size": 63488 00:21:42.731 } 00:21:42.731 ] 00:21:42.731 }' 00:21:42.731 10:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.731 10:48:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:43.298 10:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.298 10:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:43.557 10:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:43.557 10:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.557 10:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:43.816 10:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u daf97543-16b7-4649-9352-20df7da034e4 00:21:44.075 [2024-07-12 10:48:19.149844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:44.075 [2024-07-12 10:48:19.150018] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1725180 00:21:44.075 [2024-07-12 10:48:19.150031] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:44.075 [2024-07-12 10:48:19.150208] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1725c20 00:21:44.075 [2024-07-12 10:48:19.150340] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1725180 00:21:44.075 [2024-07-12 10:48:19.150351] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1725180 00:21:44.075 [2024-07-12 10:48:19.150448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:44.075 NewBaseBdev 00:21:44.075 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:44.075 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:44.075 
10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:44.075 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:44.075 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:44.075 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:44.075 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:44.333 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:44.592 [ 00:21:44.592 { 00:21:44.592 "name": "NewBaseBdev", 00:21:44.592 "aliases": [ 00:21:44.592 "daf97543-16b7-4649-9352-20df7da034e4" 00:21:44.592 ], 00:21:44.592 "product_name": "Malloc disk", 00:21:44.592 "block_size": 512, 00:21:44.592 "num_blocks": 65536, 00:21:44.592 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:44.592 "assigned_rate_limits": { 00:21:44.592 "rw_ios_per_sec": 0, 00:21:44.592 "rw_mbytes_per_sec": 0, 00:21:44.592 "r_mbytes_per_sec": 0, 00:21:44.592 "w_mbytes_per_sec": 0 00:21:44.592 }, 00:21:44.592 "claimed": true, 00:21:44.592 "claim_type": "exclusive_write", 00:21:44.592 "zoned": false, 00:21:44.592 "supported_io_types": { 00:21:44.592 "read": true, 00:21:44.592 "write": true, 00:21:44.592 "unmap": true, 00:21:44.592 "flush": true, 00:21:44.592 "reset": true, 00:21:44.592 "nvme_admin": false, 00:21:44.592 "nvme_io": false, 00:21:44.592 "nvme_io_md": false, 00:21:44.592 "write_zeroes": true, 00:21:44.592 "zcopy": true, 00:21:44.592 "get_zone_info": false, 00:21:44.592 "zone_management": false, 00:21:44.592 "zone_append": false, 00:21:44.592 "compare": false, 00:21:44.592 "compare_and_write": false, 00:21:44.592 "abort": true, 00:21:44.592 "seek_hole": false, 00:21:44.592 "seek_data": false, 00:21:44.592 "copy": true, 00:21:44.592 "nvme_iov_md": false 00:21:44.592 }, 00:21:44.592 "memory_domains": [ 00:21:44.592 { 00:21:44.592 "dma_device_id": "system", 00:21:44.592 "dma_device_type": 1 00:21:44.592 }, 00:21:44.592 { 00:21:44.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.592 "dma_device_type": 2 00:21:44.592 } 00:21:44.592 ], 00:21:44.592 "driver_specific": {} 00:21:44.592 } 00:21:44.592 ] 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.592 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.888 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.888 "name": "Existed_Raid", 00:21:44.888 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:44.888 "strip_size_kb": 0, 00:21:44.888 "state": "online", 00:21:44.888 "raid_level": "raid1", 00:21:44.888 "superblock": true, 00:21:44.888 "num_base_bdevs": 4, 00:21:44.888 "num_base_bdevs_discovered": 4, 00:21:44.888 "num_base_bdevs_operational": 4, 00:21:44.888 "base_bdevs_list": [ 00:21:44.888 { 00:21:44.888 "name": "NewBaseBdev", 00:21:44.888 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:44.888 "is_configured": true, 00:21:44.888 "data_offset": 2048, 00:21:44.888 "data_size": 63488 00:21:44.888 }, 00:21:44.888 { 00:21:44.888 "name": "BaseBdev2", 00:21:44.888 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:44.888 "is_configured": true, 00:21:44.888 "data_offset": 2048, 00:21:44.888 "data_size": 63488 00:21:44.888 }, 00:21:44.888 { 00:21:44.888 "name": "BaseBdev3", 00:21:44.888 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:44.888 "is_configured": true, 00:21:44.888 "data_offset": 2048, 00:21:44.888 "data_size": 63488 00:21:44.888 }, 00:21:44.888 { 00:21:44.888 "name": "BaseBdev4", 00:21:44.888 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:44.888 "is_configured": true, 00:21:44.888 "data_offset": 2048, 00:21:44.888 "data_size": 63488 00:21:44.888 } 00:21:44.888 ] 00:21:44.888 }' 00:21:44.888 10:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.888 10:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:45.491 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:45.750 [2024-07-12 10:48:20.714339] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:45.750 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:45.750 "name": "Existed_Raid", 00:21:45.750 "aliases": [ 
00:21:45.750 "527af40b-50ec-470f-bc2a-21a006667666" 00:21:45.750 ], 00:21:45.750 "product_name": "Raid Volume", 00:21:45.750 "block_size": 512, 00:21:45.750 "num_blocks": 63488, 00:21:45.750 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:45.750 "assigned_rate_limits": { 00:21:45.750 "rw_ios_per_sec": 0, 00:21:45.750 "rw_mbytes_per_sec": 0, 00:21:45.750 "r_mbytes_per_sec": 0, 00:21:45.750 "w_mbytes_per_sec": 0 00:21:45.750 }, 00:21:45.750 "claimed": false, 00:21:45.750 "zoned": false, 00:21:45.750 "supported_io_types": { 00:21:45.750 "read": true, 00:21:45.750 "write": true, 00:21:45.750 "unmap": false, 00:21:45.750 "flush": false, 00:21:45.750 "reset": true, 00:21:45.750 "nvme_admin": false, 00:21:45.750 "nvme_io": false, 00:21:45.750 "nvme_io_md": false, 00:21:45.750 "write_zeroes": true, 00:21:45.750 "zcopy": false, 00:21:45.750 "get_zone_info": false, 00:21:45.750 "zone_management": false, 00:21:45.750 "zone_append": false, 00:21:45.750 "compare": false, 00:21:45.750 "compare_and_write": false, 00:21:45.750 "abort": false, 00:21:45.750 "seek_hole": false, 00:21:45.750 "seek_data": false, 00:21:45.750 "copy": false, 00:21:45.750 "nvme_iov_md": false 00:21:45.750 }, 00:21:45.750 "memory_domains": [ 00:21:45.750 { 00:21:45.750 "dma_device_id": "system", 00:21:45.750 "dma_device_type": 1 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.750 "dma_device_type": 2 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "dma_device_id": "system", 00:21:45.750 "dma_device_type": 1 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.750 "dma_device_type": 2 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "dma_device_id": "system", 00:21:45.750 "dma_device_type": 1 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.750 "dma_device_type": 2 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "dma_device_id": "system", 00:21:45.750 "dma_device_type": 1 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.750 "dma_device_type": 2 00:21:45.750 } 00:21:45.750 ], 00:21:45.750 "driver_specific": { 00:21:45.750 "raid": { 00:21:45.750 "uuid": "527af40b-50ec-470f-bc2a-21a006667666", 00:21:45.750 "strip_size_kb": 0, 00:21:45.750 "state": "online", 00:21:45.750 "raid_level": "raid1", 00:21:45.750 "superblock": true, 00:21:45.750 "num_base_bdevs": 4, 00:21:45.750 "num_base_bdevs_discovered": 4, 00:21:45.750 "num_base_bdevs_operational": 4, 00:21:45.750 "base_bdevs_list": [ 00:21:45.750 { 00:21:45.750 "name": "NewBaseBdev", 00:21:45.750 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:45.750 "is_configured": true, 00:21:45.750 "data_offset": 2048, 00:21:45.750 "data_size": 63488 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "name": "BaseBdev2", 00:21:45.750 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:45.750 "is_configured": true, 00:21:45.750 "data_offset": 2048, 00:21:45.750 "data_size": 63488 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "name": "BaseBdev3", 00:21:45.750 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:45.750 "is_configured": true, 00:21:45.750 "data_offset": 2048, 00:21:45.750 "data_size": 63488 00:21:45.750 }, 00:21:45.750 { 00:21:45.750 "name": "BaseBdev4", 00:21:45.750 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:45.750 "is_configured": true, 00:21:45.750 "data_offset": 2048, 00:21:45.750 "data_size": 63488 00:21:45.750 } 00:21:45.750 ] 00:21:45.750 } 00:21:45.750 } 00:21:45.750 }' 00:21:45.750 10:48:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:45.751 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:45.751 BaseBdev2 00:21:45.751 BaseBdev3 00:21:45.751 BaseBdev4' 00:21:45.751 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:45.751 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:45.751 10:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.009 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:46.009 "name": "NewBaseBdev", 00:21:46.009 "aliases": [ 00:21:46.009 "daf97543-16b7-4649-9352-20df7da034e4" 00:21:46.009 ], 00:21:46.009 "product_name": "Malloc disk", 00:21:46.009 "block_size": 512, 00:21:46.009 "num_blocks": 65536, 00:21:46.009 "uuid": "daf97543-16b7-4649-9352-20df7da034e4", 00:21:46.009 "assigned_rate_limits": { 00:21:46.009 "rw_ios_per_sec": 0, 00:21:46.009 "rw_mbytes_per_sec": 0, 00:21:46.009 "r_mbytes_per_sec": 0, 00:21:46.009 "w_mbytes_per_sec": 0 00:21:46.009 }, 00:21:46.009 "claimed": true, 00:21:46.009 "claim_type": "exclusive_write", 00:21:46.009 "zoned": false, 00:21:46.010 "supported_io_types": { 00:21:46.010 "read": true, 00:21:46.010 "write": true, 00:21:46.010 "unmap": true, 00:21:46.010 "flush": true, 00:21:46.010 "reset": true, 00:21:46.010 "nvme_admin": false, 00:21:46.010 "nvme_io": false, 00:21:46.010 "nvme_io_md": false, 00:21:46.010 "write_zeroes": true, 00:21:46.010 "zcopy": true, 00:21:46.010 "get_zone_info": false, 00:21:46.010 "zone_management": false, 00:21:46.010 "zone_append": false, 00:21:46.010 "compare": false, 00:21:46.010 "compare_and_write": false, 00:21:46.010 "abort": true, 00:21:46.010 "seek_hole": false, 00:21:46.010 "seek_data": false, 00:21:46.010 "copy": true, 00:21:46.010 "nvme_iov_md": false 00:21:46.010 }, 00:21:46.010 "memory_domains": [ 00:21:46.010 { 00:21:46.010 "dma_device_id": "system", 00:21:46.010 "dma_device_type": 1 00:21:46.010 }, 00:21:46.010 { 00:21:46.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.010 "dma_device_type": 2 00:21:46.010 } 00:21:46.010 ], 00:21:46.010 "driver_specific": {} 00:21:46.010 }' 00:21:46.010 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.010 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.010 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:46.010 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.010 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:46.269 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.528 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:46.528 "name": "BaseBdev2", 00:21:46.528 "aliases": [ 00:21:46.528 "e4db50c2-d284-4907-bc0e-4821075866b6" 00:21:46.528 ], 00:21:46.528 "product_name": "Malloc disk", 00:21:46.528 "block_size": 512, 00:21:46.528 "num_blocks": 65536, 00:21:46.528 "uuid": "e4db50c2-d284-4907-bc0e-4821075866b6", 00:21:46.528 "assigned_rate_limits": { 00:21:46.528 "rw_ios_per_sec": 0, 00:21:46.528 "rw_mbytes_per_sec": 0, 00:21:46.528 "r_mbytes_per_sec": 0, 00:21:46.528 "w_mbytes_per_sec": 0 00:21:46.528 }, 00:21:46.528 "claimed": true, 00:21:46.528 "claim_type": "exclusive_write", 00:21:46.528 "zoned": false, 00:21:46.528 "supported_io_types": { 00:21:46.528 "read": true, 00:21:46.528 "write": true, 00:21:46.528 "unmap": true, 00:21:46.528 "flush": true, 00:21:46.528 "reset": true, 00:21:46.528 "nvme_admin": false, 00:21:46.528 "nvme_io": false, 00:21:46.528 "nvme_io_md": false, 00:21:46.528 "write_zeroes": true, 00:21:46.528 "zcopy": true, 00:21:46.528 "get_zone_info": false, 00:21:46.528 "zone_management": false, 00:21:46.528 "zone_append": false, 00:21:46.528 "compare": false, 00:21:46.528 "compare_and_write": false, 00:21:46.528 "abort": true, 00:21:46.528 "seek_hole": false, 00:21:46.528 "seek_data": false, 00:21:46.528 "copy": true, 00:21:46.528 "nvme_iov_md": false 00:21:46.528 }, 00:21:46.528 "memory_domains": [ 00:21:46.528 { 00:21:46.528 "dma_device_id": "system", 00:21:46.528 "dma_device_type": 1 00:21:46.528 }, 00:21:46.528 { 00:21:46.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.528 "dma_device_type": 2 00:21:46.528 } 00:21:46.528 ], 00:21:46.528 "driver_specific": {} 00:21:46.528 }' 00:21:46.528 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.528 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.528 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:46.528 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.788 
10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.788 10:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:47.046 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:47.046 "name": "BaseBdev3", 00:21:47.046 "aliases": [ 00:21:47.046 "8f069435-076a-4b3e-80ce-169e1688b8cc" 00:21:47.046 ], 00:21:47.046 "product_name": "Malloc disk", 00:21:47.046 "block_size": 512, 00:21:47.046 "num_blocks": 65536, 00:21:47.046 "uuid": "8f069435-076a-4b3e-80ce-169e1688b8cc", 00:21:47.046 "assigned_rate_limits": { 00:21:47.046 "rw_ios_per_sec": 0, 00:21:47.046 "rw_mbytes_per_sec": 0, 00:21:47.046 "r_mbytes_per_sec": 0, 00:21:47.046 "w_mbytes_per_sec": 0 00:21:47.046 }, 00:21:47.046 "claimed": true, 00:21:47.046 "claim_type": "exclusive_write", 00:21:47.046 "zoned": false, 00:21:47.046 "supported_io_types": { 00:21:47.046 "read": true, 00:21:47.046 "write": true, 00:21:47.046 "unmap": true, 00:21:47.046 "flush": true, 00:21:47.047 "reset": true, 00:21:47.047 "nvme_admin": false, 00:21:47.047 "nvme_io": false, 00:21:47.047 "nvme_io_md": false, 00:21:47.047 "write_zeroes": true, 00:21:47.047 "zcopy": true, 00:21:47.047 "get_zone_info": false, 00:21:47.047 "zone_management": false, 00:21:47.047 "zone_append": false, 00:21:47.047 "compare": false, 00:21:47.047 "compare_and_write": false, 00:21:47.047 "abort": true, 00:21:47.047 "seek_hole": false, 00:21:47.047 "seek_data": false, 00:21:47.047 "copy": true, 00:21:47.047 "nvme_iov_md": false 00:21:47.047 }, 00:21:47.047 "memory_domains": [ 00:21:47.047 { 00:21:47.047 "dma_device_id": "system", 00:21:47.047 "dma_device_type": 1 00:21:47.047 }, 00:21:47.047 { 00:21:47.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.047 "dma_device_type": 2 00:21:47.047 } 00:21:47.047 ], 00:21:47.047 "driver_specific": {} 00:21:47.047 }' 00:21:47.047 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:47.305 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.564 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.564 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:47.564 10:48:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:47.564 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:47.564 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:47.822 "name": "BaseBdev4", 00:21:47.822 "aliases": [ 00:21:47.822 "01439852-ee0e-467f-b7fb-a779c57ce53e" 00:21:47.822 ], 00:21:47.822 "product_name": "Malloc disk", 00:21:47.822 "block_size": 512, 00:21:47.822 "num_blocks": 65536, 00:21:47.822 "uuid": "01439852-ee0e-467f-b7fb-a779c57ce53e", 00:21:47.822 "assigned_rate_limits": { 00:21:47.822 "rw_ios_per_sec": 0, 00:21:47.822 "rw_mbytes_per_sec": 0, 00:21:47.822 "r_mbytes_per_sec": 0, 00:21:47.822 "w_mbytes_per_sec": 0 00:21:47.822 }, 00:21:47.822 "claimed": true, 00:21:47.822 "claim_type": "exclusive_write", 00:21:47.822 "zoned": false, 00:21:47.822 "supported_io_types": { 00:21:47.822 "read": true, 00:21:47.822 "write": true, 00:21:47.822 "unmap": true, 00:21:47.822 "flush": true, 00:21:47.822 "reset": true, 00:21:47.822 "nvme_admin": false, 00:21:47.822 "nvme_io": false, 00:21:47.822 "nvme_io_md": false, 00:21:47.822 "write_zeroes": true, 00:21:47.822 "zcopy": true, 00:21:47.822 "get_zone_info": false, 00:21:47.822 "zone_management": false, 00:21:47.822 "zone_append": false, 00:21:47.822 "compare": false, 00:21:47.822 "compare_and_write": false, 00:21:47.822 "abort": true, 00:21:47.822 "seek_hole": false, 00:21:47.822 "seek_data": false, 00:21:47.822 "copy": true, 00:21:47.822 "nvme_iov_md": false 00:21:47.822 }, 00:21:47.822 "memory_domains": [ 00:21:47.822 { 00:21:47.822 "dma_device_id": "system", 00:21:47.822 "dma_device_type": 1 00:21:47.822 }, 00:21:47.822 { 00:21:47.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.822 "dma_device_type": 2 00:21:47.822 } 00:21:47.822 ], 00:21:47.822 "driver_specific": {} 00:21:47.822 }' 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:47.822 10:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.079 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.079 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:48.079 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.079 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.079 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:48.079 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:48.337 [2024-07-12 10:48:23.389133] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:48.337 [2024-07-12 10:48:23.389163] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:48.337 [2024-07-12 10:48:23.389214] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:48.337 [2024-07-12 10:48:23.389509] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:48.337 [2024-07-12 10:48:23.389522] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1725180 name Existed_Raid, state offline 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2109864 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2109864 ']' 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2109864 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2109864 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2109864' 00:21:48.337 killing process with pid 2109864 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2109864 00:21:48.337 [2024-07-12 10:48:23.455041] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:48.337 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2109864 00:21:48.337 [2024-07-12 10:48:23.491415] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:48.596 10:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:48.596 00:21:48.596 real 0m32.175s 00:21:48.596 user 0m59.101s 00:21:48.596 sys 0m5.787s 00:21:48.596 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:48.596 10:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:48.596 ************************************ 00:21:48.596 END TEST raid_state_function_test_sb 00:21:48.596 ************************************ 00:21:48.596 10:48:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:48.596 10:48:23 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:21:48.596 10:48:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:48.596 10:48:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:48.596 10:48:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:48.596 ************************************ 00:21:48.596 START TEST raid_superblock_test 00:21:48.596 ************************************ 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test 
raid1 4 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2115248 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2115248 /var/tmp/spdk-raid.sock 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2115248 ']' 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:48.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:48.596 10:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.854 [2024-07-12 10:48:23.842056] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:21:48.854 [2024-07-12 10:48:23.842120] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2115248 ] 00:21:48.854 [2024-07-12 10:48:23.971884] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:49.112 [2024-07-12 10:48:24.079574] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:49.112 [2024-07-12 10:48:24.144151] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:49.112 [2024-07-12 10:48:24.144189] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:49.678 10:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:49.936 malloc1 00:21:49.936 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:50.194 [2024-07-12 10:48:25.249493] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:50.194 [2024-07-12 10:48:25.249540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.194 [2024-07-12 10:48:25.249561] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdef570 00:21:50.194 [2024-07-12 10:48:25.249574] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.194 [2024-07-12 10:48:25.251273] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.194 [2024-07-12 10:48:25.251302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:50.194 pt1 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:50.194 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:50.452 malloc2 00:21:50.452 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:50.710 [2024-07-12 10:48:25.735557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:50.710 [2024-07-12 10:48:25.735606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.710 [2024-07-12 10:48:25.735624] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf0970 00:21:50.710 [2024-07-12 10:48:25.735637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.710 [2024-07-12 10:48:25.737271] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.710 [2024-07-12 10:48:25.737300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:50.710 pt2 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:50.710 10:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:50.968 malloc3 00:21:50.968 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:51.226 [2024-07-12 10:48:26.222418] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:51.226 [2024-07-12 10:48:26.222464] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.226 [2024-07-12 10:48:26.222489] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf87340 00:21:51.226 [2024-07-12 10:48:26.222502] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.226 [2024-07-12 10:48:26.224045] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.226 [2024-07-12 10:48:26.224075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:51.226 pt3 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:51.226 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:51.484 malloc4 00:21:51.484 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:51.742 [2024-07-12 10:48:26.717553] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:51.742 [2024-07-12 10:48:26.717600] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.742 [2024-07-12 10:48:26.717622] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf89c60 00:21:51.742 [2024-07-12 10:48:26.717635] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.742 [2024-07-12 10:48:26.719181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.742 [2024-07-12 10:48:26.719210] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:51.742 pt4 00:21:51.742 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:51.742 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:51.742 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:52.000 [2024-07-12 10:48:26.958211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:52.000 [2024-07-12 10:48:26.959538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:52.000 [2024-07-12 10:48:26.959592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:52.000 [2024-07-12 10:48:26.959641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:52.001 [2024-07-12 10:48:26.959813] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde7530 00:21:52.001 [2024-07-12 10:48:26.959824] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:52.001 [2024-07-12 10:48:26.960020] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0xde5770 00:21:52.001 [2024-07-12 10:48:26.960175] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde7530 00:21:52.001 [2024-07-12 10:48:26.960186] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde7530 00:21:52.001 [2024-07-12 10:48:26.960288] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.001 10:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.259 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.259 "name": "raid_bdev1", 00:21:52.259 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:21:52.259 "strip_size_kb": 0, 00:21:52.259 "state": "online", 00:21:52.259 "raid_level": "raid1", 00:21:52.259 "superblock": true, 00:21:52.259 "num_base_bdevs": 4, 00:21:52.259 "num_base_bdevs_discovered": 4, 00:21:52.259 "num_base_bdevs_operational": 4, 00:21:52.259 "base_bdevs_list": [ 00:21:52.259 { 00:21:52.259 "name": "pt1", 00:21:52.259 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:52.259 "is_configured": true, 00:21:52.259 "data_offset": 2048, 00:21:52.259 "data_size": 63488 00:21:52.259 }, 00:21:52.259 { 00:21:52.259 "name": "pt2", 00:21:52.259 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:52.259 "is_configured": true, 00:21:52.259 "data_offset": 2048, 00:21:52.259 "data_size": 63488 00:21:52.259 }, 00:21:52.259 { 00:21:52.259 "name": "pt3", 00:21:52.259 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:52.259 "is_configured": true, 00:21:52.259 "data_offset": 2048, 00:21:52.259 "data_size": 63488 00:21:52.259 }, 00:21:52.259 { 00:21:52.259 "name": "pt4", 00:21:52.259 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:52.259 "is_configured": true, 00:21:52.259 "data_offset": 2048, 00:21:52.259 "data_size": 63488 00:21:52.259 } 00:21:52.259 ] 00:21:52.259 }' 00:21:52.259 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.259 10:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:52.825 10:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:53.082 [2024-07-12 10:48:28.025329] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:53.082 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:53.082 "name": "raid_bdev1", 00:21:53.082 "aliases": [ 00:21:53.082 "c0354607-767d-4d9a-b436-5f84cd193e44" 00:21:53.082 ], 00:21:53.082 "product_name": "Raid Volume", 00:21:53.082 "block_size": 512, 00:21:53.082 "num_blocks": 63488, 00:21:53.082 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:21:53.082 "assigned_rate_limits": { 00:21:53.082 "rw_ios_per_sec": 0, 00:21:53.082 "rw_mbytes_per_sec": 0, 00:21:53.082 "r_mbytes_per_sec": 0, 00:21:53.082 "w_mbytes_per_sec": 0 00:21:53.082 }, 00:21:53.082 "claimed": false, 00:21:53.082 "zoned": false, 00:21:53.082 "supported_io_types": { 00:21:53.082 "read": true, 00:21:53.082 "write": true, 00:21:53.082 "unmap": false, 00:21:53.082 "flush": false, 00:21:53.082 "reset": true, 00:21:53.082 "nvme_admin": false, 00:21:53.082 "nvme_io": false, 00:21:53.082 "nvme_io_md": false, 00:21:53.082 "write_zeroes": true, 00:21:53.082 "zcopy": false, 00:21:53.082 "get_zone_info": false, 00:21:53.082 "zone_management": false, 00:21:53.082 "zone_append": false, 00:21:53.082 "compare": false, 00:21:53.082 "compare_and_write": false, 00:21:53.082 "abort": false, 00:21:53.082 "seek_hole": false, 00:21:53.082 "seek_data": false, 00:21:53.082 "copy": false, 00:21:53.082 "nvme_iov_md": false 00:21:53.082 }, 00:21:53.082 "memory_domains": [ 00:21:53.082 { 00:21:53.082 "dma_device_id": "system", 00:21:53.082 "dma_device_type": 1 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.082 "dma_device_type": 2 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "dma_device_id": "system", 00:21:53.082 "dma_device_type": 1 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.082 "dma_device_type": 2 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "dma_device_id": "system", 00:21:53.082 "dma_device_type": 1 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.082 "dma_device_type": 2 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "dma_device_id": "system", 00:21:53.082 "dma_device_type": 1 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.082 "dma_device_type": 2 00:21:53.082 } 00:21:53.082 ], 00:21:53.082 "driver_specific": { 00:21:53.082 "raid": { 00:21:53.082 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:21:53.082 "strip_size_kb": 0, 00:21:53.082 "state": "online", 00:21:53.082 "raid_level": "raid1", 00:21:53.082 "superblock": true, 00:21:53.082 
"num_base_bdevs": 4, 00:21:53.082 "num_base_bdevs_discovered": 4, 00:21:53.082 "num_base_bdevs_operational": 4, 00:21:53.082 "base_bdevs_list": [ 00:21:53.082 { 00:21:53.082 "name": "pt1", 00:21:53.082 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:53.082 "is_configured": true, 00:21:53.082 "data_offset": 2048, 00:21:53.082 "data_size": 63488 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "name": "pt2", 00:21:53.082 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:53.082 "is_configured": true, 00:21:53.082 "data_offset": 2048, 00:21:53.082 "data_size": 63488 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "name": "pt3", 00:21:53.082 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:53.082 "is_configured": true, 00:21:53.082 "data_offset": 2048, 00:21:53.082 "data_size": 63488 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "name": "pt4", 00:21:53.082 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:53.082 "is_configured": true, 00:21:53.082 "data_offset": 2048, 00:21:53.082 "data_size": 63488 00:21:53.082 } 00:21:53.082 ] 00:21:53.082 } 00:21:53.082 } 00:21:53.082 }' 00:21:53.082 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:53.082 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:53.082 pt2 00:21:53.082 pt3 00:21:53.082 pt4' 00:21:53.082 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.082 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:53.082 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.338 "name": "pt1", 00:21:53.338 "aliases": [ 00:21:53.338 "00000000-0000-0000-0000-000000000001" 00:21:53.338 ], 00:21:53.338 "product_name": "passthru", 00:21:53.338 "block_size": 512, 00:21:53.338 "num_blocks": 65536, 00:21:53.338 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:53.338 "assigned_rate_limits": { 00:21:53.338 "rw_ios_per_sec": 0, 00:21:53.338 "rw_mbytes_per_sec": 0, 00:21:53.338 "r_mbytes_per_sec": 0, 00:21:53.338 "w_mbytes_per_sec": 0 00:21:53.338 }, 00:21:53.338 "claimed": true, 00:21:53.338 "claim_type": "exclusive_write", 00:21:53.338 "zoned": false, 00:21:53.338 "supported_io_types": { 00:21:53.338 "read": true, 00:21:53.338 "write": true, 00:21:53.338 "unmap": true, 00:21:53.338 "flush": true, 00:21:53.338 "reset": true, 00:21:53.338 "nvme_admin": false, 00:21:53.338 "nvme_io": false, 00:21:53.338 "nvme_io_md": false, 00:21:53.338 "write_zeroes": true, 00:21:53.338 "zcopy": true, 00:21:53.338 "get_zone_info": false, 00:21:53.338 "zone_management": false, 00:21:53.338 "zone_append": false, 00:21:53.338 "compare": false, 00:21:53.338 "compare_and_write": false, 00:21:53.338 "abort": true, 00:21:53.338 "seek_hole": false, 00:21:53.338 "seek_data": false, 00:21:53.338 "copy": true, 00:21:53.338 "nvme_iov_md": false 00:21:53.338 }, 00:21:53.338 "memory_domains": [ 00:21:53.338 { 00:21:53.338 "dma_device_id": "system", 00:21:53.338 "dma_device_type": 1 00:21:53.338 }, 00:21:53.338 { 00:21:53.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.338 "dma_device_type": 2 00:21:53.338 } 00:21:53.338 ], 00:21:53.338 "driver_specific": { 00:21:53.338 "passthru": { 00:21:53.338 
"name": "pt1", 00:21:53.338 "base_bdev_name": "malloc1" 00:21:53.338 } 00:21:53.338 } 00:21:53.338 }' 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:53.338 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:53.595 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.853 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.853 "name": "pt2", 00:21:53.853 "aliases": [ 00:21:53.853 "00000000-0000-0000-0000-000000000002" 00:21:53.853 ], 00:21:53.853 "product_name": "passthru", 00:21:53.853 "block_size": 512, 00:21:53.853 "num_blocks": 65536, 00:21:53.853 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:53.853 "assigned_rate_limits": { 00:21:53.853 "rw_ios_per_sec": 0, 00:21:53.853 "rw_mbytes_per_sec": 0, 00:21:53.853 "r_mbytes_per_sec": 0, 00:21:53.853 "w_mbytes_per_sec": 0 00:21:53.853 }, 00:21:53.853 "claimed": true, 00:21:53.853 "claim_type": "exclusive_write", 00:21:53.853 "zoned": false, 00:21:53.853 "supported_io_types": { 00:21:53.853 "read": true, 00:21:53.853 "write": true, 00:21:53.853 "unmap": true, 00:21:53.853 "flush": true, 00:21:53.853 "reset": true, 00:21:53.853 "nvme_admin": false, 00:21:53.853 "nvme_io": false, 00:21:53.853 "nvme_io_md": false, 00:21:53.853 "write_zeroes": true, 00:21:53.853 "zcopy": true, 00:21:53.853 "get_zone_info": false, 00:21:53.853 "zone_management": false, 00:21:53.853 "zone_append": false, 00:21:53.853 "compare": false, 00:21:53.853 "compare_and_write": false, 00:21:53.853 "abort": true, 00:21:53.853 "seek_hole": false, 00:21:53.853 "seek_data": false, 00:21:53.853 "copy": true, 00:21:53.853 "nvme_iov_md": false 00:21:53.853 }, 00:21:53.853 "memory_domains": [ 00:21:53.853 { 00:21:53.853 "dma_device_id": "system", 00:21:53.853 "dma_device_type": 1 00:21:53.853 }, 00:21:53.853 { 00:21:53.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.853 "dma_device_type": 2 00:21:53.853 } 00:21:53.853 ], 00:21:53.853 "driver_specific": { 00:21:53.853 "passthru": { 00:21:53.853 "name": "pt2", 00:21:53.853 "base_bdev_name": "malloc2" 00:21:53.853 } 00:21:53.853 } 00:21:53.853 }' 00:21:53.853 10:48:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.853 10:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.853 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.853 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:54.111 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.370 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.370 "name": "pt3", 00:21:54.370 "aliases": [ 00:21:54.370 "00000000-0000-0000-0000-000000000003" 00:21:54.370 ], 00:21:54.370 "product_name": "passthru", 00:21:54.370 "block_size": 512, 00:21:54.370 "num_blocks": 65536, 00:21:54.370 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:54.370 "assigned_rate_limits": { 00:21:54.370 "rw_ios_per_sec": 0, 00:21:54.370 "rw_mbytes_per_sec": 0, 00:21:54.370 "r_mbytes_per_sec": 0, 00:21:54.370 "w_mbytes_per_sec": 0 00:21:54.370 }, 00:21:54.370 "claimed": true, 00:21:54.370 "claim_type": "exclusive_write", 00:21:54.370 "zoned": false, 00:21:54.370 "supported_io_types": { 00:21:54.370 "read": true, 00:21:54.370 "write": true, 00:21:54.370 "unmap": true, 00:21:54.370 "flush": true, 00:21:54.370 "reset": true, 00:21:54.370 "nvme_admin": false, 00:21:54.370 "nvme_io": false, 00:21:54.370 "nvme_io_md": false, 00:21:54.370 "write_zeroes": true, 00:21:54.370 "zcopy": true, 00:21:54.370 "get_zone_info": false, 00:21:54.370 "zone_management": false, 00:21:54.370 "zone_append": false, 00:21:54.370 "compare": false, 00:21:54.370 "compare_and_write": false, 00:21:54.370 "abort": true, 00:21:54.370 "seek_hole": false, 00:21:54.370 "seek_data": false, 00:21:54.370 "copy": true, 00:21:54.370 "nvme_iov_md": false 00:21:54.370 }, 00:21:54.370 "memory_domains": [ 00:21:54.370 { 00:21:54.370 "dma_device_id": "system", 00:21:54.370 "dma_device_type": 1 00:21:54.370 }, 00:21:54.370 { 00:21:54.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.370 "dma_device_type": 2 00:21:54.370 } 00:21:54.370 ], 00:21:54.370 "driver_specific": { 00:21:54.370 "passthru": { 00:21:54.370 "name": "pt3", 00:21:54.370 "base_bdev_name": "malloc3" 00:21:54.370 } 00:21:54.370 } 00:21:54.370 }' 00:21:54.370 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.628 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.886 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.886 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.886 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.886 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:54.886 10:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:55.144 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:55.144 "name": "pt4", 00:21:55.144 "aliases": [ 00:21:55.144 "00000000-0000-0000-0000-000000000004" 00:21:55.144 ], 00:21:55.144 "product_name": "passthru", 00:21:55.144 "block_size": 512, 00:21:55.144 "num_blocks": 65536, 00:21:55.144 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:55.144 "assigned_rate_limits": { 00:21:55.144 "rw_ios_per_sec": 0, 00:21:55.144 "rw_mbytes_per_sec": 0, 00:21:55.144 "r_mbytes_per_sec": 0, 00:21:55.144 "w_mbytes_per_sec": 0 00:21:55.144 }, 00:21:55.144 "claimed": true, 00:21:55.144 "claim_type": "exclusive_write", 00:21:55.144 "zoned": false, 00:21:55.144 "supported_io_types": { 00:21:55.144 "read": true, 00:21:55.144 "write": true, 00:21:55.144 "unmap": true, 00:21:55.145 "flush": true, 00:21:55.145 "reset": true, 00:21:55.145 "nvme_admin": false, 00:21:55.145 "nvme_io": false, 00:21:55.145 "nvme_io_md": false, 00:21:55.145 "write_zeroes": true, 00:21:55.145 "zcopy": true, 00:21:55.145 "get_zone_info": false, 00:21:55.145 "zone_management": false, 00:21:55.145 "zone_append": false, 00:21:55.145 "compare": false, 00:21:55.145 "compare_and_write": false, 00:21:55.145 "abort": true, 00:21:55.145 "seek_hole": false, 00:21:55.145 "seek_data": false, 00:21:55.145 "copy": true, 00:21:55.145 "nvme_iov_md": false 00:21:55.145 }, 00:21:55.145 "memory_domains": [ 00:21:55.145 { 00:21:55.145 "dma_device_id": "system", 00:21:55.145 "dma_device_type": 1 00:21:55.145 }, 00:21:55.145 { 00:21:55.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.145 "dma_device_type": 2 00:21:55.145 } 00:21:55.145 ], 00:21:55.145 "driver_specific": { 00:21:55.145 "passthru": { 00:21:55.145 "name": "pt4", 00:21:55.145 "base_bdev_name": "malloc4" 00:21:55.145 } 00:21:55.145 } 00:21:55.145 }' 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.145 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.403 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.403 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.403 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.403 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.403 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:55.403 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:55.662 [2024-07-12 10:48:30.696408] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:55.662 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c0354607-767d-4d9a-b436-5f84cd193e44 00:21:55.662 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c0354607-767d-4d9a-b436-5f84cd193e44 ']' 00:21:55.662 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:55.920 [2024-07-12 10:48:30.940745] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:55.920 [2024-07-12 10:48:30.940765] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:55.920 [2024-07-12 10:48:30.940813] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:55.920 [2024-07-12 10:48:30.940896] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:55.920 [2024-07-12 10:48:30.940909] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde7530 name raid_bdev1, state offline 00:21:55.920 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.920 10:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:56.178 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:56.178 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:56.178 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:56.178 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:56.437 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:56.437 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:21:56.696 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:56.696 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:56.696 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:56.696 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:56.955 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:56.955 10:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:57.214 [2024-07-12 10:48:32.356433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:57.214 [2024-07-12 10:48:32.357812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:57.214 [2024-07-12 10:48:32.357855] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:57.214 [2024-07-12 10:48:32.357890] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:57.214 [2024-07-12 10:48:32.357933] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:57.214 [2024-07-12 10:48:32.357972] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:57.214 [2024-07-12 10:48:32.357995] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:57.214 [2024-07-12 10:48:32.358018] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:57.214 [2024-07-12 10:48:32.358036] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:57.214 [2024-07-12 10:48:32.358058] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf92ff0 name raid_bdev1, state configuring 00:21:57.214 request: 00:21:57.214 { 00:21:57.214 "name": "raid_bdev1", 00:21:57.214 "raid_level": "raid1", 00:21:57.214 "base_bdevs": [ 00:21:57.214 "malloc1", 00:21:57.214 "malloc2", 00:21:57.214 "malloc3", 00:21:57.214 "malloc4" 00:21:57.214 ], 00:21:57.214 "superblock": false, 00:21:57.214 "method": "bdev_raid_create", 00:21:57.214 "req_id": 1 00:21:57.214 } 00:21:57.214 Got JSON-RPC error response 00:21:57.214 response: 00:21:57.214 { 00:21:57.214 "code": -17, 00:21:57.214 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:57.214 } 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.214 10:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:57.501 10:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:57.501 10:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:57.501 10:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:58.066 [2024-07-12 10:48:33.082281] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:58.066 [2024-07-12 10:48:33.082333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.066 [2024-07-12 10:48:33.082355] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdef7a0 00:21:58.066 [2024-07-12 10:48:33.082368] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.066 [2024-07-12 10:48:33.083957] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.066 [2024-07-12 10:48:33.083987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:58.066 [2024-07-12 10:48:33.084059] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:21:58.066 [2024-07-12 10:48:33.084085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:58.066 pt1 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.066 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.324 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.324 "name": "raid_bdev1", 00:21:58.324 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:21:58.324 "strip_size_kb": 0, 00:21:58.324 "state": "configuring", 00:21:58.324 "raid_level": "raid1", 00:21:58.324 "superblock": true, 00:21:58.324 "num_base_bdevs": 4, 00:21:58.324 "num_base_bdevs_discovered": 1, 00:21:58.324 "num_base_bdevs_operational": 4, 00:21:58.324 "base_bdevs_list": [ 00:21:58.325 { 00:21:58.325 "name": "pt1", 00:21:58.325 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:58.325 "is_configured": true, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 }, 00:21:58.325 { 00:21:58.325 "name": null, 00:21:58.325 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:58.325 "is_configured": false, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 }, 00:21:58.325 { 00:21:58.325 "name": null, 00:21:58.325 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:58.325 "is_configured": false, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 }, 00:21:58.325 { 00:21:58.325 "name": null, 00:21:58.325 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:58.325 "is_configured": false, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 } 00:21:58.325 ] 00:21:58.325 }' 00:21:58.325 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.325 10:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:58.992 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:58.992 10:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:21:58.992 [2024-07-12 10:48:34.145106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:58.992 [2024-07-12 10:48:34.145160] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.992 [2024-07-12 10:48:34.145179] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf88940 00:21:58.992 [2024-07-12 10:48:34.145192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.992 [2024-07-12 10:48:34.145536] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.992 [2024-07-12 10:48:34.145555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:58.992 [2024-07-12 10:48:34.145616] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:58.992 [2024-07-12 10:48:34.145635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:58.992 pt2 00:21:58.992 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:59.251 [2024-07-12 10:48:34.389770] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.251 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.509 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.509 "name": "raid_bdev1", 00:21:59.509 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:21:59.509 "strip_size_kb": 0, 00:21:59.509 "state": "configuring", 00:21:59.509 "raid_level": "raid1", 00:21:59.509 "superblock": true, 00:21:59.509 "num_base_bdevs": 4, 00:21:59.509 "num_base_bdevs_discovered": 1, 00:21:59.509 "num_base_bdevs_operational": 4, 00:21:59.509 "base_bdevs_list": [ 00:21:59.509 { 00:21:59.509 "name": "pt1", 00:21:59.509 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:59.509 "is_configured": true, 00:21:59.509 "data_offset": 2048, 00:21:59.509 "data_size": 63488 00:21:59.509 }, 00:21:59.509 { 00:21:59.509 "name": null, 00:21:59.509 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:21:59.509 "is_configured": false, 00:21:59.509 "data_offset": 2048, 00:21:59.509 "data_size": 63488 00:21:59.509 }, 00:21:59.509 { 00:21:59.509 "name": null, 00:21:59.509 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:59.509 "is_configured": false, 00:21:59.509 "data_offset": 2048, 00:21:59.509 "data_size": 63488 00:21:59.509 }, 00:21:59.509 { 00:21:59.509 "name": null, 00:21:59.509 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:59.509 "is_configured": false, 00:21:59.509 "data_offset": 2048, 00:21:59.509 "data_size": 63488 00:21:59.509 } 00:21:59.509 ] 00:21:59.509 }' 00:21:59.509 10:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.509 10:48:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.076 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:00.076 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.076 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:00.335 [2024-07-12 10:48:35.312195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:00.336 [2024-07-12 10:48:35.312246] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.336 [2024-07-12 10:48:35.312265] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde6060 00:22:00.336 [2024-07-12 10:48:35.312278] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.336 [2024-07-12 10:48:35.312621] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.336 [2024-07-12 10:48:35.312639] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:00.336 [2024-07-12 10:48:35.312702] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:00.336 [2024-07-12 10:48:35.312722] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:00.336 pt2 00:22:00.336 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:00.336 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.336 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:00.336 [2024-07-12 10:48:35.496683] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:00.336 [2024-07-12 10:48:35.496720] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.336 [2024-07-12 10:48:35.496738] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde88d0 00:22:00.336 [2024-07-12 10:48:35.496750] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.336 [2024-07-12 10:48:35.497040] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.336 [2024-07-12 10:48:35.497057] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:00.336 [2024-07-12 10:48:35.497110] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:22:00.336 [2024-07-12 10:48:35.497127] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:00.336 pt3 00:22:00.336 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:00.336 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.336 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:00.595 [2024-07-12 10:48:35.741339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:00.595 [2024-07-12 10:48:35.741377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.595 [2024-07-12 10:48:35.741393] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde9b80 00:22:00.595 [2024-07-12 10:48:35.741404] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.595 [2024-07-12 10:48:35.741701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.595 [2024-07-12 10:48:35.741720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:00.595 [2024-07-12 10:48:35.741771] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:00.595 [2024-07-12 10:48:35.741789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:00.595 [2024-07-12 10:48:35.741906] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde6780 00:22:00.595 [2024-07-12 10:48:35.741917] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:00.595 [2024-07-12 10:48:35.742086] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdebfa0 00:22:00.595 [2024-07-12 10:48:35.742217] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde6780 00:22:00.595 [2024-07-12 10:48:35.742227] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde6780 00:22:00.595 [2024-07-12 10:48:35.742319] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:00.595 pt4 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.595 
10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.595 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.855 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.855 "name": "raid_bdev1", 00:22:00.855 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:00.855 "strip_size_kb": 0, 00:22:00.855 "state": "online", 00:22:00.855 "raid_level": "raid1", 00:22:00.855 "superblock": true, 00:22:00.855 "num_base_bdevs": 4, 00:22:00.855 "num_base_bdevs_discovered": 4, 00:22:00.855 "num_base_bdevs_operational": 4, 00:22:00.855 "base_bdevs_list": [ 00:22:00.855 { 00:22:00.855 "name": "pt1", 00:22:00.855 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:00.855 "is_configured": true, 00:22:00.855 "data_offset": 2048, 00:22:00.855 "data_size": 63488 00:22:00.855 }, 00:22:00.855 { 00:22:00.855 "name": "pt2", 00:22:00.855 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:00.855 "is_configured": true, 00:22:00.855 "data_offset": 2048, 00:22:00.855 "data_size": 63488 00:22:00.855 }, 00:22:00.855 { 00:22:00.855 "name": "pt3", 00:22:00.855 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:00.855 "is_configured": true, 00:22:00.855 "data_offset": 2048, 00:22:00.855 "data_size": 63488 00:22:00.855 }, 00:22:00.855 { 00:22:00.855 "name": "pt4", 00:22:00.855 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:00.855 "is_configured": true, 00:22:00.855 "data_offset": 2048, 00:22:00.855 "data_size": 63488 00:22:00.855 } 00:22:00.855 ] 00:22:00.855 }' 00:22:00.855 10:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.855 10:48:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:01.792 10:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:02.052 [2024-07-12 10:48:37.045101] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:02.052 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:02.052 "name": "raid_bdev1", 00:22:02.052 "aliases": [ 00:22:02.052 "c0354607-767d-4d9a-b436-5f84cd193e44" 00:22:02.052 ], 00:22:02.052 "product_name": "Raid Volume", 00:22:02.052 "block_size": 512, 00:22:02.052 "num_blocks": 63488, 00:22:02.052 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:02.052 "assigned_rate_limits": { 00:22:02.052 "rw_ios_per_sec": 0, 00:22:02.052 
"rw_mbytes_per_sec": 0, 00:22:02.052 "r_mbytes_per_sec": 0, 00:22:02.052 "w_mbytes_per_sec": 0 00:22:02.052 }, 00:22:02.052 "claimed": false, 00:22:02.052 "zoned": false, 00:22:02.052 "supported_io_types": { 00:22:02.052 "read": true, 00:22:02.052 "write": true, 00:22:02.052 "unmap": false, 00:22:02.052 "flush": false, 00:22:02.052 "reset": true, 00:22:02.052 "nvme_admin": false, 00:22:02.052 "nvme_io": false, 00:22:02.052 "nvme_io_md": false, 00:22:02.052 "write_zeroes": true, 00:22:02.052 "zcopy": false, 00:22:02.052 "get_zone_info": false, 00:22:02.052 "zone_management": false, 00:22:02.052 "zone_append": false, 00:22:02.052 "compare": false, 00:22:02.052 "compare_and_write": false, 00:22:02.052 "abort": false, 00:22:02.052 "seek_hole": false, 00:22:02.052 "seek_data": false, 00:22:02.052 "copy": false, 00:22:02.052 "nvme_iov_md": false 00:22:02.052 }, 00:22:02.052 "memory_domains": [ 00:22:02.052 { 00:22:02.052 "dma_device_id": "system", 00:22:02.052 "dma_device_type": 1 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.052 "dma_device_type": 2 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "dma_device_id": "system", 00:22:02.052 "dma_device_type": 1 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.052 "dma_device_type": 2 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "dma_device_id": "system", 00:22:02.052 "dma_device_type": 1 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.052 "dma_device_type": 2 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "dma_device_id": "system", 00:22:02.052 "dma_device_type": 1 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.052 "dma_device_type": 2 00:22:02.052 } 00:22:02.052 ], 00:22:02.052 "driver_specific": { 00:22:02.052 "raid": { 00:22:02.052 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:02.052 "strip_size_kb": 0, 00:22:02.052 "state": "online", 00:22:02.052 "raid_level": "raid1", 00:22:02.052 "superblock": true, 00:22:02.052 "num_base_bdevs": 4, 00:22:02.052 "num_base_bdevs_discovered": 4, 00:22:02.052 "num_base_bdevs_operational": 4, 00:22:02.052 "base_bdevs_list": [ 00:22:02.052 { 00:22:02.052 "name": "pt1", 00:22:02.052 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:02.052 "is_configured": true, 00:22:02.052 "data_offset": 2048, 00:22:02.052 "data_size": 63488 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "name": "pt2", 00:22:02.052 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:02.052 "is_configured": true, 00:22:02.052 "data_offset": 2048, 00:22:02.052 "data_size": 63488 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "name": "pt3", 00:22:02.052 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:02.052 "is_configured": true, 00:22:02.052 "data_offset": 2048, 00:22:02.052 "data_size": 63488 00:22:02.052 }, 00:22:02.052 { 00:22:02.052 "name": "pt4", 00:22:02.052 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:02.052 "is_configured": true, 00:22:02.052 "data_offset": 2048, 00:22:02.052 "data_size": 63488 00:22:02.052 } 00:22:02.052 ] 00:22:02.052 } 00:22:02.052 } 00:22:02.052 }' 00:22:02.052 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:02.052 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:02.052 pt2 00:22:02.052 pt3 00:22:02.052 pt4' 00:22:02.052 10:48:37 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.052 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:02.052 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.311 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.311 "name": "pt1", 00:22:02.311 "aliases": [ 00:22:02.311 "00000000-0000-0000-0000-000000000001" 00:22:02.311 ], 00:22:02.311 "product_name": "passthru", 00:22:02.311 "block_size": 512, 00:22:02.311 "num_blocks": 65536, 00:22:02.311 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:02.311 "assigned_rate_limits": { 00:22:02.311 "rw_ios_per_sec": 0, 00:22:02.312 "rw_mbytes_per_sec": 0, 00:22:02.312 "r_mbytes_per_sec": 0, 00:22:02.312 "w_mbytes_per_sec": 0 00:22:02.312 }, 00:22:02.312 "claimed": true, 00:22:02.312 "claim_type": "exclusive_write", 00:22:02.312 "zoned": false, 00:22:02.312 "supported_io_types": { 00:22:02.312 "read": true, 00:22:02.312 "write": true, 00:22:02.312 "unmap": true, 00:22:02.312 "flush": true, 00:22:02.312 "reset": true, 00:22:02.312 "nvme_admin": false, 00:22:02.312 "nvme_io": false, 00:22:02.312 "nvme_io_md": false, 00:22:02.312 "write_zeroes": true, 00:22:02.312 "zcopy": true, 00:22:02.312 "get_zone_info": false, 00:22:02.312 "zone_management": false, 00:22:02.312 "zone_append": false, 00:22:02.312 "compare": false, 00:22:02.312 "compare_and_write": false, 00:22:02.312 "abort": true, 00:22:02.312 "seek_hole": false, 00:22:02.312 "seek_data": false, 00:22:02.312 "copy": true, 00:22:02.312 "nvme_iov_md": false 00:22:02.312 }, 00:22:02.312 "memory_domains": [ 00:22:02.312 { 00:22:02.312 "dma_device_id": "system", 00:22:02.312 "dma_device_type": 1 00:22:02.312 }, 00:22:02.312 { 00:22:02.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.312 "dma_device_type": 2 00:22:02.312 } 00:22:02.312 ], 00:22:02.312 "driver_specific": { 00:22:02.312 "passthru": { 00:22:02.312 "name": "pt1", 00:22:02.312 "base_bdev_name": "malloc1" 00:22:02.312 } 00:22:02.312 } 00:22:02.312 }' 00:22:02.312 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.312 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.312 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:02.312 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.312 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:02.571 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.830 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.830 "name": "pt2", 00:22:02.830 "aliases": [ 00:22:02.830 "00000000-0000-0000-0000-000000000002" 00:22:02.830 ], 00:22:02.830 "product_name": "passthru", 00:22:02.830 "block_size": 512, 00:22:02.830 "num_blocks": 65536, 00:22:02.830 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:02.830 "assigned_rate_limits": { 00:22:02.830 "rw_ios_per_sec": 0, 00:22:02.830 "rw_mbytes_per_sec": 0, 00:22:02.830 "r_mbytes_per_sec": 0, 00:22:02.830 "w_mbytes_per_sec": 0 00:22:02.830 }, 00:22:02.830 "claimed": true, 00:22:02.830 "claim_type": "exclusive_write", 00:22:02.830 "zoned": false, 00:22:02.830 "supported_io_types": { 00:22:02.830 "read": true, 00:22:02.830 "write": true, 00:22:02.830 "unmap": true, 00:22:02.830 "flush": true, 00:22:02.830 "reset": true, 00:22:02.830 "nvme_admin": false, 00:22:02.830 "nvme_io": false, 00:22:02.830 "nvme_io_md": false, 00:22:02.830 "write_zeroes": true, 00:22:02.830 "zcopy": true, 00:22:02.830 "get_zone_info": false, 00:22:02.830 "zone_management": false, 00:22:02.830 "zone_append": false, 00:22:02.830 "compare": false, 00:22:02.830 "compare_and_write": false, 00:22:02.830 "abort": true, 00:22:02.830 "seek_hole": false, 00:22:02.830 "seek_data": false, 00:22:02.830 "copy": true, 00:22:02.830 "nvme_iov_md": false 00:22:02.830 }, 00:22:02.830 "memory_domains": [ 00:22:02.830 { 00:22:02.830 "dma_device_id": "system", 00:22:02.830 "dma_device_type": 1 00:22:02.830 }, 00:22:02.830 { 00:22:02.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.830 "dma_device_type": 2 00:22:02.830 } 00:22:02.830 ], 00:22:02.830 "driver_specific": { 00:22:02.830 "passthru": { 00:22:02.830 "name": "pt2", 00:22:02.830 "base_bdev_name": "malloc2" 00:22:02.830 } 00:22:02.830 } 00:22:02.830 }' 00:22:02.830 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.830 10:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.089 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.349 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:03.349 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:03.349 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
pt3 00:22:03.349 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:03.608 "name": "pt3", 00:22:03.608 "aliases": [ 00:22:03.608 "00000000-0000-0000-0000-000000000003" 00:22:03.608 ], 00:22:03.608 "product_name": "passthru", 00:22:03.608 "block_size": 512, 00:22:03.608 "num_blocks": 65536, 00:22:03.608 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:03.608 "assigned_rate_limits": { 00:22:03.608 "rw_ios_per_sec": 0, 00:22:03.608 "rw_mbytes_per_sec": 0, 00:22:03.608 "r_mbytes_per_sec": 0, 00:22:03.608 "w_mbytes_per_sec": 0 00:22:03.608 }, 00:22:03.608 "claimed": true, 00:22:03.608 "claim_type": "exclusive_write", 00:22:03.608 "zoned": false, 00:22:03.608 "supported_io_types": { 00:22:03.608 "read": true, 00:22:03.608 "write": true, 00:22:03.608 "unmap": true, 00:22:03.608 "flush": true, 00:22:03.608 "reset": true, 00:22:03.608 "nvme_admin": false, 00:22:03.608 "nvme_io": false, 00:22:03.608 "nvme_io_md": false, 00:22:03.608 "write_zeroes": true, 00:22:03.608 "zcopy": true, 00:22:03.608 "get_zone_info": false, 00:22:03.608 "zone_management": false, 00:22:03.608 "zone_append": false, 00:22:03.608 "compare": false, 00:22:03.608 "compare_and_write": false, 00:22:03.608 "abort": true, 00:22:03.608 "seek_hole": false, 00:22:03.608 "seek_data": false, 00:22:03.608 "copy": true, 00:22:03.608 "nvme_iov_md": false 00:22:03.608 }, 00:22:03.608 "memory_domains": [ 00:22:03.608 { 00:22:03.608 "dma_device_id": "system", 00:22:03.608 "dma_device_type": 1 00:22:03.608 }, 00:22:03.608 { 00:22:03.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.608 "dma_device_type": 2 00:22:03.608 } 00:22:03.608 ], 00:22:03.608 "driver_specific": { 00:22:03.608 "passthru": { 00:22:03.608 "name": "pt3", 00:22:03.608 "base_bdev_name": "malloc3" 00:22:03.608 } 00:22:03.608 } 00:22:03.608 }' 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.608 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.867 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:03.867 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.867 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.867 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:03.867 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:03.867 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:03.867 10:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:04.127 10:48:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:04.127 "name": "pt4", 00:22:04.127 "aliases": [ 00:22:04.127 "00000000-0000-0000-0000-000000000004" 00:22:04.127 ], 00:22:04.127 "product_name": "passthru", 00:22:04.127 "block_size": 512, 00:22:04.127 "num_blocks": 65536, 00:22:04.127 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:04.127 "assigned_rate_limits": { 00:22:04.127 "rw_ios_per_sec": 0, 00:22:04.127 "rw_mbytes_per_sec": 0, 00:22:04.127 "r_mbytes_per_sec": 0, 00:22:04.127 "w_mbytes_per_sec": 0 00:22:04.127 }, 00:22:04.127 "claimed": true, 00:22:04.127 "claim_type": "exclusive_write", 00:22:04.127 "zoned": false, 00:22:04.127 "supported_io_types": { 00:22:04.127 "read": true, 00:22:04.127 "write": true, 00:22:04.127 "unmap": true, 00:22:04.127 "flush": true, 00:22:04.127 "reset": true, 00:22:04.127 "nvme_admin": false, 00:22:04.127 "nvme_io": false, 00:22:04.127 "nvme_io_md": false, 00:22:04.127 "write_zeroes": true, 00:22:04.127 "zcopy": true, 00:22:04.127 "get_zone_info": false, 00:22:04.127 "zone_management": false, 00:22:04.127 "zone_append": false, 00:22:04.127 "compare": false, 00:22:04.127 "compare_and_write": false, 00:22:04.127 "abort": true, 00:22:04.127 "seek_hole": false, 00:22:04.127 "seek_data": false, 00:22:04.127 "copy": true, 00:22:04.127 "nvme_iov_md": false 00:22:04.127 }, 00:22:04.127 "memory_domains": [ 00:22:04.127 { 00:22:04.127 "dma_device_id": "system", 00:22:04.127 "dma_device_type": 1 00:22:04.127 }, 00:22:04.127 { 00:22:04.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.127 "dma_device_type": 2 00:22:04.127 } 00:22:04.127 ], 00:22:04.127 "driver_specific": { 00:22:04.127 "passthru": { 00:22:04.127 "name": "pt4", 00:22:04.127 "base_bdev_name": "malloc4" 00:22:04.127 } 00:22:04.127 } 00:22:04.127 }' 00:22:04.127 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.127 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.127 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:04.127 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.127 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:04.386 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:04.645 [2024-07-12 10:48:39.728232] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:04.645 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c0354607-767d-4d9a-b436-5f84cd193e44 '!=' 
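The block above repeats the same metadata probe for each passthru base bdev (bdev_raid.sh@203-208 in this run): dump the bdev, then confirm the block size, metadata size, interleave and DIF type match what the test created. A condensed sketch of that loop, assuming the rpc.py path and /var/tmp/spdk-raid.sock socket used here; the $rpc variable is only shorthand introduced for readability:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for name in pt2 pt3 pt4; do
      base_bdev_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
      [[ $(jq .block_size    <<< "$base_bdev_info") == 512  ]]   # plain 512-byte blocks
      [[ $(jq .md_size       <<< "$base_bdev_info") == null ]]   # no metadata region
      [[ $(jq .md_interleave <<< "$base_bdev_info") == null ]]
      [[ $(jq .dif_type      <<< "$base_bdev_info") == null ]]   # no DIF protection
  done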
c0354607-767d-4d9a-b436-5f84cd193e44 ']' 00:22:04.645 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:04.645 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:04.645 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:04.645 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:04.903 [2024-07-12 10:48:39.972611] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.903 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.904 10:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:05.161 10:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.161 "name": "raid_bdev1", 00:22:05.161 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:05.161 "strip_size_kb": 0, 00:22:05.161 "state": "online", 00:22:05.161 "raid_level": "raid1", 00:22:05.161 "superblock": true, 00:22:05.161 "num_base_bdevs": 4, 00:22:05.161 "num_base_bdevs_discovered": 3, 00:22:05.161 "num_base_bdevs_operational": 3, 00:22:05.161 "base_bdevs_list": [ 00:22:05.161 { 00:22:05.161 "name": null, 00:22:05.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.162 "is_configured": false, 00:22:05.162 "data_offset": 2048, 00:22:05.162 "data_size": 63488 00:22:05.162 }, 00:22:05.162 { 00:22:05.162 "name": "pt2", 00:22:05.162 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:05.162 "is_configured": true, 00:22:05.162 "data_offset": 2048, 00:22:05.162 "data_size": 63488 00:22:05.162 }, 00:22:05.162 { 00:22:05.162 "name": "pt3", 00:22:05.162 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:05.162 "is_configured": true, 00:22:05.162 "data_offset": 2048, 00:22:05.162 "data_size": 63488 00:22:05.162 }, 00:22:05.162 { 00:22:05.162 "name": "pt4", 00:22:05.162 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:05.162 "is_configured": true, 00:22:05.162 "data_offset": 2048, 00:22:05.162 "data_size": 63488 00:22:05.162 } 00:22:05.162 ] 00:22:05.162 }' 00:22:05.162 10:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
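What bdev_raid.sh@490-495 is asserting above: raid1 has redundancy, so deleting one base bdev (pt1) must leave raid_bdev1 online with three of its four members. Restated as plain RPC calls, a sketch under the same socket and path assumptions as before:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_passthru_delete pt1
  $rpc bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)"'
  # expected: "online 3", losing a single mirror does not take a raid1 bdev offline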
xtrace_disable 00:22:05.162 10:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.728 10:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:05.986 [2024-07-12 10:48:41.083540] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:05.986 [2024-07-12 10:48:41.083568] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:05.986 [2024-07-12 10:48:41.083625] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:05.986 [2024-07-12 10:48:41.083687] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:05.986 [2024-07-12 10:48:41.083699] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde6780 name raid_bdev1, state offline 00:22:05.986 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.986 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:06.244 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:06.244 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:06.244 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:06.244 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:06.244 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:06.502 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:06.502 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:06.502 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:06.761 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:06.761 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:06.761 10:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:07.020 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:07.020 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:07.020 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:07.020 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:07.020 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:07.280 [2024-07-12 10:48:42.286733] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:07.280 [2024-07-12 10:48:42.286782] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
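Before the re-assembly sub-case starts, the trace tears everything down: delete the raid bdev, confirm bdev_raid_get_bdevs reports nothing, then remove the remaining passthru bdevs (pt1 is already gone). In effect, with the same shorthand:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_raid_delete raid_bdev1                        # state goes online -> offline, then freed
  raid_bdev=$($rpc bdev_raid_get_bdevs all | jq -r '.[]')
  [[ -z $raid_bdev ]]                                     # no raid bdev left to report
  for pt in pt2 pt3 pt4; do
      $rpc bdev_passthru_delete "$pt"
  done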
00:22:07.280 [2024-07-12 10:48:42.286802] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf89700 00:22:07.280 [2024-07-12 10:48:42.286815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.280 [2024-07-12 10:48:42.288455] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.280 [2024-07-12 10:48:42.288494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:07.280 [2024-07-12 10:48:42.288560] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:07.280 [2024-07-12 10:48:42.288588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:07.280 pt2 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.280 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.538 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.538 "name": "raid_bdev1", 00:22:07.538 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:07.538 "strip_size_kb": 0, 00:22:07.538 "state": "configuring", 00:22:07.538 "raid_level": "raid1", 00:22:07.538 "superblock": true, 00:22:07.538 "num_base_bdevs": 4, 00:22:07.538 "num_base_bdevs_discovered": 1, 00:22:07.538 "num_base_bdevs_operational": 3, 00:22:07.538 "base_bdevs_list": [ 00:22:07.538 { 00:22:07.538 "name": null, 00:22:07.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.538 "is_configured": false, 00:22:07.538 "data_offset": 2048, 00:22:07.538 "data_size": 63488 00:22:07.538 }, 00:22:07.538 { 00:22:07.538 "name": "pt2", 00:22:07.538 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:07.538 "is_configured": true, 00:22:07.538 "data_offset": 2048, 00:22:07.539 "data_size": 63488 00:22:07.539 }, 00:22:07.539 { 00:22:07.539 "name": null, 00:22:07.539 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:07.539 "is_configured": false, 00:22:07.539 "data_offset": 2048, 00:22:07.539 "data_size": 63488 00:22:07.539 }, 00:22:07.539 { 00:22:07.539 "name": null, 00:22:07.539 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:07.539 "is_configured": false, 00:22:07.539 "data_offset": 2048, 00:22:07.539 "data_size": 63488 
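The step above is the core of the superblock test: raid_bdev1 was created with -s, so its configuration lives in a superblock on every base bdev. Re-registering just pt2 is enough for the examine path to find that superblock and re-create raid_bdev1, but only in the configuring state, since a single member is not enough to assemble the array. Roughly:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  $rpc bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
  # expected: "configuring 1/4", the raid bdev exists again but is not usable yet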
00:22:07.539 } 00:22:07.539 ] 00:22:07.539 }' 00:22:07.539 10:48:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.539 10:48:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:08.473 [2024-07-12 10:48:43.602230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:08.473 [2024-07-12 10:48:43.602281] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:08.473 [2024-07-12 10:48:43.602304] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdefa10 00:22:08.473 [2024-07-12 10:48:43.602317] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:08.473 [2024-07-12 10:48:43.602663] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:08.473 [2024-07-12 10:48:43.602683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:08.473 [2024-07-12 10:48:43.602750] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:08.473 [2024-07-12 10:48:43.602770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:08.473 pt3 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.473 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.731 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.731 "name": "raid_bdev1", 00:22:08.731 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:08.731 "strip_size_kb": 0, 00:22:08.731 "state": "configuring", 00:22:08.731 "raid_level": "raid1", 00:22:08.731 "superblock": true, 00:22:08.731 "num_base_bdevs": 4, 00:22:08.731 "num_base_bdevs_discovered": 2, 00:22:08.731 
"num_base_bdevs_operational": 3, 00:22:08.731 "base_bdevs_list": [ 00:22:08.731 { 00:22:08.731 "name": null, 00:22:08.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.731 "is_configured": false, 00:22:08.731 "data_offset": 2048, 00:22:08.731 "data_size": 63488 00:22:08.731 }, 00:22:08.731 { 00:22:08.731 "name": "pt2", 00:22:08.731 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:08.731 "is_configured": true, 00:22:08.731 "data_offset": 2048, 00:22:08.731 "data_size": 63488 00:22:08.731 }, 00:22:08.731 { 00:22:08.731 "name": "pt3", 00:22:08.731 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:08.731 "is_configured": true, 00:22:08.731 "data_offset": 2048, 00:22:08.731 "data_size": 63488 00:22:08.731 }, 00:22:08.731 { 00:22:08.731 "name": null, 00:22:08.731 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:08.731 "is_configured": false, 00:22:08.731 "data_offset": 2048, 00:22:08.731 "data_size": 63488 00:22:08.731 } 00:22:08.731 ] 00:22:08.731 }' 00:22:08.731 10:48:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.731 10:48:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:09.296 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:09.296 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:09.296 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:22:09.296 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:09.554 [2024-07-12 10:48:44.669074] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:09.554 [2024-07-12 10:48:44.669123] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.554 [2024-07-12 10:48:44.669142] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf92520 00:22:09.554 [2024-07-12 10:48:44.669156] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.554 [2024-07-12 10:48:44.669496] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.554 [2024-07-12 10:48:44.669515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:09.554 [2024-07-12 10:48:44.669575] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:09.554 [2024-07-12 10:48:44.669595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:09.554 [2024-07-12 10:48:44.669702] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde6ea0 00:22:09.554 [2024-07-12 10:48:44.669712] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:09.554 [2024-07-12 10:48:44.669877] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdeb600 00:22:09.554 [2024-07-12 10:48:44.670007] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde6ea0 00:22:09.554 [2024-07-12 10:48:44.670017] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde6ea0 00:22:09.554 [2024-07-12 10:48:44.670110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.554 pt4 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.554 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.812 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.812 "name": "raid_bdev1", 00:22:09.812 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:09.812 "strip_size_kb": 0, 00:22:09.812 "state": "online", 00:22:09.812 "raid_level": "raid1", 00:22:09.812 "superblock": true, 00:22:09.812 "num_base_bdevs": 4, 00:22:09.812 "num_base_bdevs_discovered": 3, 00:22:09.812 "num_base_bdevs_operational": 3, 00:22:09.812 "base_bdevs_list": [ 00:22:09.812 { 00:22:09.812 "name": null, 00:22:09.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.812 "is_configured": false, 00:22:09.812 "data_offset": 2048, 00:22:09.812 "data_size": 63488 00:22:09.812 }, 00:22:09.812 { 00:22:09.812 "name": "pt2", 00:22:09.812 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:09.812 "is_configured": true, 00:22:09.812 "data_offset": 2048, 00:22:09.812 "data_size": 63488 00:22:09.812 }, 00:22:09.812 { 00:22:09.812 "name": "pt3", 00:22:09.812 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:09.812 "is_configured": true, 00:22:09.812 "data_offset": 2048, 00:22:09.812 "data_size": 63488 00:22:09.812 }, 00:22:09.812 { 00:22:09.812 "name": "pt4", 00:22:09.812 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:09.812 "is_configured": true, 00:22:09.812 "data_offset": 2048, 00:22:09.812 "data_size": 63488 00:22:09.812 } 00:22:09.812 ] 00:22:09.812 }' 00:22:09.812 10:48:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.812 10:48:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.379 10:48:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:10.637 [2024-07-12 10:48:45.776009] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:10.637 [2024-07-12 10:48:45.776035] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:10.637 [2024-07-12 10:48:45.776087] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:10.637 [2024-07-12 10:48:45.776155] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:10.637 [2024-07-12 10:48:45.776166] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde6ea0 name raid_bdev1, state offline 00:22:10.637 10:48:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.637 10:48:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:10.894 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:10.894 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:10.894 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:22:10.894 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:22:10.894 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:11.152 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:11.410 [2024-07-12 10:48:46.441742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:11.410 [2024-07-12 10:48:46.441788] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.410 [2024-07-12 10:48:46.441807] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf92520 00:22:11.410 [2024-07-12 10:48:46.441819] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.410 [2024-07-12 10:48:46.443470] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.410 [2024-07-12 10:48:46.443515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:11.410 [2024-07-12 10:48:46.443581] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:11.410 [2024-07-12 10:48:46.443608] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:11.410 [2024-07-12 10:48:46.443710] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:11.410 [2024-07-12 10:48:46.443723] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:11.410 [2024-07-12 10:48:46.443737] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde6060 name raid_bdev1, state configuring 00:22:11.410 [2024-07-12 10:48:46.443761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:11.410 [2024-07-12 10:48:46.443836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:11.410 pt1 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:11.410 10:48:46 
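This is the stale-superblock case. pt1 still carries the superblock written before it was removed (seq_number 2), while pt2 and pt3 carry the newer one (seq_number 4). As the debug lines above show, the examine path first assembles raid_bdev1 from pt1's old superblock, then discards that raid bdev when it sees the higher sequence number on pt2 and re-assembles around pt2/pt3, leaving pt1 unconfigured. A sketch of how that outcome can be observed over RPC:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | .base_bdevs_list[] | "\(.name) \(.is_configured)"'
  # expected: pt2 and pt3 report "true", while the slots for pt1 and the still-missing
  # pt4 read "null false"; the out-of-date member is not re-admitted automatically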
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.410 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.669 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:11.669 "name": "raid_bdev1", 00:22:11.669 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:11.669 "strip_size_kb": 0, 00:22:11.669 "state": "configuring", 00:22:11.669 "raid_level": "raid1", 00:22:11.669 "superblock": true, 00:22:11.669 "num_base_bdevs": 4, 00:22:11.669 "num_base_bdevs_discovered": 2, 00:22:11.669 "num_base_bdevs_operational": 3, 00:22:11.669 "base_bdevs_list": [ 00:22:11.669 { 00:22:11.669 "name": null, 00:22:11.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.669 "is_configured": false, 00:22:11.669 "data_offset": 2048, 00:22:11.669 "data_size": 63488 00:22:11.669 }, 00:22:11.669 { 00:22:11.669 "name": "pt2", 00:22:11.669 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:11.669 "is_configured": true, 00:22:11.669 "data_offset": 2048, 00:22:11.669 "data_size": 63488 00:22:11.669 }, 00:22:11.669 { 00:22:11.669 "name": "pt3", 00:22:11.669 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:11.669 "is_configured": true, 00:22:11.669 "data_offset": 2048, 00:22:11.669 "data_size": 63488 00:22:11.669 }, 00:22:11.669 { 00:22:11.669 "name": null, 00:22:11.669 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:11.669 "is_configured": false, 00:22:11.669 "data_offset": 2048, 00:22:11.669 "data_size": 63488 00:22:11.669 } 00:22:11.669 ] 00:22:11.669 }' 00:22:11.669 10:48:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:11.669 10:48:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:12.235 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:22:12.235 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:12.494 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:22:12.494 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:12.753 [2024-07-12 10:48:47.693065] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:12.753 [2024-07-12 10:48:47.693115] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:22:12.753 [2024-07-12 10:48:47.693135] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde6310 00:22:12.753 [2024-07-12 10:48:47.693148] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:12.753 [2024-07-12 10:48:47.693495] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:12.753 [2024-07-12 10:48:47.693515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:12.753 [2024-07-12 10:48:47.693576] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:12.753 [2024-07-12 10:48:47.693598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:12.753 [2024-07-12 10:48:47.693708] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde9b40 00:22:12.753 [2024-07-12 10:48:47.693719] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:12.753 [2024-07-12 10:48:47.693891] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf89990 00:22:12.753 [2024-07-12 10:48:47.694019] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde9b40 00:22:12.753 [2024-07-12 10:48:47.694029] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde9b40 00:22:12.753 [2024-07-12 10:48:47.694122] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:12.753 pt4 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.753 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.754 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.754 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.754 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.754 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:13.056 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:13.056 "name": "raid_bdev1", 00:22:13.056 "uuid": "c0354607-767d-4d9a-b436-5f84cd193e44", 00:22:13.056 "strip_size_kb": 0, 00:22:13.056 "state": "online", 00:22:13.056 "raid_level": "raid1", 00:22:13.056 "superblock": true, 00:22:13.056 "num_base_bdevs": 4, 00:22:13.056 "num_base_bdevs_discovered": 3, 00:22:13.056 "num_base_bdevs_operational": 3, 00:22:13.056 "base_bdevs_list": [ 00:22:13.056 { 00:22:13.056 "name": null, 00:22:13.056 "uuid": "00000000-0000-0000-0000-000000000000", 
00:22:13.056 "is_configured": false, 00:22:13.056 "data_offset": 2048, 00:22:13.056 "data_size": 63488 00:22:13.056 }, 00:22:13.056 { 00:22:13.056 "name": "pt2", 00:22:13.056 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:13.056 "is_configured": true, 00:22:13.056 "data_offset": 2048, 00:22:13.056 "data_size": 63488 00:22:13.056 }, 00:22:13.056 { 00:22:13.056 "name": "pt3", 00:22:13.056 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:13.056 "is_configured": true, 00:22:13.056 "data_offset": 2048, 00:22:13.056 "data_size": 63488 00:22:13.056 }, 00:22:13.056 { 00:22:13.056 "name": "pt4", 00:22:13.056 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:13.056 "is_configured": true, 00:22:13.056 "data_offset": 2048, 00:22:13.056 "data_size": 63488 00:22:13.056 } 00:22:13.056 ] 00:22:13.056 }' 00:22:13.056 10:48:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:13.056 10:48:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:13.624 10:48:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:13.624 10:48:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:13.624 10:48:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:13.624 10:48:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:13.624 10:48:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:13.882 [2024-07-12 10:48:48.928635] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' c0354607-767d-4d9a-b436-5f84cd193e44 '!=' c0354607-767d-4d9a-b436-5f84cd193e44 ']' 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2115248 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2115248 ']' 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2115248 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2115248 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2115248' 00:22:13.882 killing process with pid 2115248 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2115248 00:22:13.882 [2024-07-12 10:48:48.984440] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:13.882 [2024-07-12 10:48:48.984508] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:13.882 [2024-07-12 10:48:48.984575] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:22:13.882 [2024-07-12 10:48:48.984588] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde9b40 name raid_bdev1, state offline 00:22:13.882 10:48:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2115248 00:22:13.882 [2024-07-12 10:48:49.021156] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:14.141 10:48:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:14.141 00:22:14.141 real 0m25.444s 00:22:14.141 user 0m46.606s 00:22:14.141 sys 0m4.512s 00:22:14.141 10:48:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:14.141 10:48:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.141 ************************************ 00:22:14.141 END TEST raid_superblock_test 00:22:14.141 ************************************ 00:22:14.141 10:48:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:14.141 10:48:49 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:22:14.141 10:48:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:14.141 10:48:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:14.141 10:48:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:14.141 ************************************ 00:22:14.141 START TEST raid_read_error_test 00:22:14.141 ************************************ 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:14.141 10:48:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TZv1wssiKx 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2119066 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2119066 /var/tmp/spdk-raid.sock 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2119066 ']' 00:22:14.141 10:48:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:14.142 10:48:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:14.142 10:48:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:14.142 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:14.142 10:48:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:14.142 10:48:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.399 [2024-07-12 10:48:49.378860] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
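raid_read_error_test drives I/O through bdevperf instead of issuing RPCs alone: the app is started with -z so it idles until a perform_tests RPC arrives, and a temp file under /raidtest collects its log. A reduced sketch of the launch above; redirecting stderr into $bdevperf_log is an assumption here, since the trace only shows the temp file being created and the pid being recorded:

  bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
  bdevperf_log=$(mktemp -p /raidtest)
  # 60 s of 50/50 random read/write, 128 KiB I/Os, queue depth 1, bdev_raid debug logging on
  $bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 128k -q 1 -z -f -L bdev_raid 2> "$bdevperf_log" &
  raid_pid=$!
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock   # autotest_common.sh helper, as traced above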
00:22:14.399 [2024-07-12 10:48:49.378929] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2119066 ] 00:22:14.399 [2024-07-12 10:48:49.507129] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:14.657 [2024-07-12 10:48:49.613697] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:14.657 [2024-07-12 10:48:49.680710] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:14.657 [2024-07-12 10:48:49.680747] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:15.222 10:48:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:15.222 10:48:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:15.222 10:48:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:15.222 10:48:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:15.480 BaseBdev1_malloc 00:22:15.480 10:48:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:15.737 true 00:22:15.737 10:48:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:15.995 [2024-07-12 10:48:51.007935] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:15.995 [2024-07-12 10:48:51.007981] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:15.995 [2024-07-12 10:48:51.008004] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4f0d0 00:22:15.995 [2024-07-12 10:48:51.008017] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:15.995 [2024-07-12 10:48:51.009910] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:15.995 [2024-07-12 10:48:51.009943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:15.995 BaseBdev1 00:22:15.995 10:48:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:15.995 10:48:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:16.253 BaseBdev2_malloc 00:22:16.253 10:48:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:16.511 true 00:22:16.511 10:48:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:16.769 [2024-07-12 10:48:51.742441] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:16.769 [2024-07-12 10:48:51.742495] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:16.769 [2024-07-12 10:48:51.742518] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc53910 00:22:16.769 [2024-07-12 10:48:51.742531] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:16.769 [2024-07-12 10:48:51.744154] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:16.769 [2024-07-12 10:48:51.744184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:16.769 BaseBdev2 00:22:16.769 10:48:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:16.769 10:48:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:17.027 BaseBdev3_malloc 00:22:17.027 10:48:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:17.284 true 00:22:17.284 10:48:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:17.284 [2024-07-12 10:48:52.466153] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:17.284 [2024-07-12 10:48:52.466200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:17.284 [2024-07-12 10:48:52.466222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc55bd0 00:22:17.284 [2024-07-12 10:48:52.466235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:17.284 [2024-07-12 10:48:52.467812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:17.284 [2024-07-12 10:48:52.467844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:17.284 BaseBdev3 00:22:17.541 10:48:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:17.541 10:48:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:17.541 BaseBdev4_malloc 00:22:17.541 10:48:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:17.798 true 00:22:17.798 10:48:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:18.054 [2024-07-12 10:48:53.184629] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:18.054 [2024-07-12 10:48:53.184676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:18.054 [2024-07-12 10:48:53.184696] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc56aa0 00:22:18.054 [2024-07-12 10:48:53.184709] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:18.054 [2024-07-12 10:48:53.186314] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
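Each leg of the array is built as a three-layer stack so that I/O errors can be injected later: a malloc bdev at the bottom, an error bdev wrapped around it (registered as EE_<malloc name>), and a passthru bdev on top, which is what the raid bdev will claim. The creation loop traced above amounts to:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for i in 1 2 3 4; do
      $rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"            # 32 MB, 512-byte blocks
      $rpc bdev_error_create "BaseBdev${i}_malloc"                       # yields EE_BaseBdev${i}_malloc
      $rpc bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
  done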
pt_bdev registered 00:22:18.054 [2024-07-12 10:48:53.186348] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:18.054 BaseBdev4 00:22:18.054 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:18.310 [2024-07-12 10:48:53.425293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:18.310 [2024-07-12 10:48:53.426659] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:18.310 [2024-07-12 10:48:53.426728] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:18.310 [2024-07-12 10:48:53.426790] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:18.310 [2024-07-12 10:48:53.427028] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc50c20 00:22:18.310 [2024-07-12 10:48:53.427039] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:18.310 [2024-07-12 10:48:53.427240] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa5260 00:22:18.310 [2024-07-12 10:48:53.427396] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc50c20 00:22:18.310 [2024-07-12 10:48:53.427406] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc50c20 00:22:18.310 [2024-07-12 10:48:53.427525] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.310 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.568 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.568 "name": "raid_bdev1", 00:22:18.568 "uuid": "37ccc3eb-05ca-47a4-ba29-ad63ecf9be43", 00:22:18.568 "strip_size_kb": 0, 00:22:18.568 "state": "online", 00:22:18.568 "raid_level": "raid1", 00:22:18.568 "superblock": true, 00:22:18.568 "num_base_bdevs": 4, 00:22:18.568 "num_base_bdevs_discovered": 4, 00:22:18.568 
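With the four error-capable base bdevs in place, the raid1 bdev is created with a superblock and is expected to come online immediately with all members discovered, which is what the verify step above confirms:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
  $rpc bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
  # expected: "online 4/4"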
"num_base_bdevs_operational": 4, 00:22:18.568 "base_bdevs_list": [ 00:22:18.568 { 00:22:18.568 "name": "BaseBdev1", 00:22:18.568 "uuid": "deb003a3-f169-5f2e-b27e-ba6faa11fd26", 00:22:18.568 "is_configured": true, 00:22:18.568 "data_offset": 2048, 00:22:18.568 "data_size": 63488 00:22:18.568 }, 00:22:18.568 { 00:22:18.568 "name": "BaseBdev2", 00:22:18.568 "uuid": "1e1367b9-57e6-53fd-b9b7-81180e8333ed", 00:22:18.568 "is_configured": true, 00:22:18.568 "data_offset": 2048, 00:22:18.568 "data_size": 63488 00:22:18.568 }, 00:22:18.568 { 00:22:18.568 "name": "BaseBdev3", 00:22:18.568 "uuid": "e9d0095f-f85f-5428-84e0-685549dd6b73", 00:22:18.568 "is_configured": true, 00:22:18.568 "data_offset": 2048, 00:22:18.568 "data_size": 63488 00:22:18.568 }, 00:22:18.568 { 00:22:18.568 "name": "BaseBdev4", 00:22:18.568 "uuid": "2b721666-5b10-50ee-a06a-ff5eb0f468e3", 00:22:18.568 "is_configured": true, 00:22:18.568 "data_offset": 2048, 00:22:18.568 "data_size": 63488 00:22:18.568 } 00:22:18.568 ] 00:22:18.568 }' 00:22:18.568 10:48:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.568 10:48:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:19.132 10:48:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:19.132 10:48:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:19.390 [2024-07-12 10:48:54.400151] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa4c60 00:22:20.325 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.584 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.842 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.842 "name": "raid_bdev1", 00:22:20.842 "uuid": "37ccc3eb-05ca-47a4-ba29-ad63ecf9be43", 00:22:20.842 "strip_size_kb": 0, 00:22:20.842 "state": "online", 00:22:20.842 "raid_level": "raid1", 00:22:20.842 "superblock": true, 00:22:20.842 "num_base_bdevs": 4, 00:22:20.842 "num_base_bdevs_discovered": 4, 00:22:20.842 "num_base_bdevs_operational": 4, 00:22:20.842 "base_bdevs_list": [ 00:22:20.842 { 00:22:20.842 "name": "BaseBdev1", 00:22:20.842 "uuid": "deb003a3-f169-5f2e-b27e-ba6faa11fd26", 00:22:20.842 "is_configured": true, 00:22:20.842 "data_offset": 2048, 00:22:20.842 "data_size": 63488 00:22:20.842 }, 00:22:20.842 { 00:22:20.842 "name": "BaseBdev2", 00:22:20.842 "uuid": "1e1367b9-57e6-53fd-b9b7-81180e8333ed", 00:22:20.842 "is_configured": true, 00:22:20.842 "data_offset": 2048, 00:22:20.842 "data_size": 63488 00:22:20.842 }, 00:22:20.842 { 00:22:20.842 "name": "BaseBdev3", 00:22:20.842 "uuid": "e9d0095f-f85f-5428-84e0-685549dd6b73", 00:22:20.842 "is_configured": true, 00:22:20.842 "data_offset": 2048, 00:22:20.842 "data_size": 63488 00:22:20.842 }, 00:22:20.842 { 00:22:20.842 "name": "BaseBdev4", 00:22:20.842 "uuid": "2b721666-5b10-50ee-a06a-ff5eb0f468e3", 00:22:20.842 "is_configured": true, 00:22:20.842 "data_offset": 2048, 00:22:20.842 "data_size": 63488 00:22:20.842 } 00:22:20.842 ] 00:22:20.842 }' 00:22:20.842 10:48:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.842 10:48:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:21.408 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:21.666 [2024-07-12 10:48:56.623870] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:21.666 [2024-07-12 10:48:56.623909] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:21.666 [2024-07-12 10:48:56.627048] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:21.666 [2024-07-12 10:48:56.627087] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.666 [2024-07-12 10:48:56.627205] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:21.666 [2024-07-12 10:48:56.627222] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc50c20 name raid_bdev1, state offline 00:22:21.666 0 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2119066 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2119066 ']' 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2119066 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2119066 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2119066' 00:22:21.666 killing process with pid 2119066 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2119066 00:22:21.666 [2024-07-12 10:48:56.689787] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:21.666 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2119066 00:22:21.666 [2024-07-12 10:48:56.720410] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TZv1wssiKx 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:21.925 00:22:21.925 real 0m7.636s 00:22:21.925 user 0m12.281s 00:22:21.925 sys 0m1.314s 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:21.925 10:48:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:21.925 ************************************ 00:22:21.925 END TEST raid_read_error_test 00:22:21.925 ************************************ 00:22:21.925 10:48:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:21.925 10:48:56 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:21.925 10:48:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:21.925 10:48:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:21.925 10:48:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:21.925 ************************************ 00:22:21.925 START TEST raid_write_error_test 00:22:21.925 ************************************ 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Zr9wQ5i6IB 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2120088 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2120088 /var/tmp/spdk-raid.sock 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2120088 ']' 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:21.925 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:21.925 10:48:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.183 [2024-07-12 10:48:57.146372] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:22:22.183 [2024-07-12 10:48:57.146523] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2120088 ] 00:22:22.183 [2024-07-12 10:48:57.332863] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:22.441 [2024-07-12 10:48:57.436605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:22.441 [2024-07-12 10:48:57.502895] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:22.441 [2024-07-12 10:48:57.502940] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:23.007 10:48:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:23.007 10:48:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:23.007 10:48:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:23.007 10:48:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:23.007 BaseBdev1_malloc 00:22:23.007 10:48:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:23.265 true 00:22:23.265 10:48:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:23.523 [2024-07-12 10:48:58.633843] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:23.523 [2024-07-12 10:48:58.633889] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:23.523 [2024-07-12 10:48:58.633909] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f00d0 00:22:23.523 [2024-07-12 10:48:58.633921] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:23.523 [2024-07-12 10:48:58.635770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:23.523 [2024-07-12 10:48:58.635801] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:23.523 BaseBdev1 00:22:23.523 10:48:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:23.523 10:48:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:23.782 BaseBdev2_malloc 00:22:23.782 10:48:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:24.040 true 00:22:24.041 10:48:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:24.299 [2024-07-12 10:48:59.268106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:24.299 [2024-07-12 10:48:59.268149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:24.299 [2024-07-12 10:48:59.268171] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f4910 00:22:24.299 [2024-07-12 10:48:59.268184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:24.299 [2024-07-12 10:48:59.269734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:24.299 [2024-07-12 10:48:59.269762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:24.299 BaseBdev2 00:22:24.299 10:48:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:24.299 10:48:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:24.557 BaseBdev3_malloc 00:22:24.557 10:48:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:24.815 true 00:22:25.074 10:49:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:25.074 [2024-07-12 10:49:00.256050] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:25.074 [2024-07-12 10:49:00.256100] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:25.074 [2024-07-12 10:49:00.256123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f6bd0 00:22:25.074 [2024-07-12 10:49:00.256136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:25.074 [2024-07-12 10:49:00.257805] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:25.074 [2024-07-12 10:49:00.257837] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:25.074 BaseBdev3 00:22:25.332 10:49:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:25.332 10:49:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:25.332 BaseBdev4_malloc 00:22:25.332 10:49:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:25.591 true 00:22:25.591 10:49:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:25.850 [2024-07-12 10:49:00.827130] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:25.850 [2024-07-12 10:49:00.827171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:22:25.850 [2024-07-12 10:49:00.827192] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f7aa0 00:22:25.850 [2024-07-12 10:49:00.827205] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:25.850 [2024-07-12 10:49:00.828764] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:25.850 [2024-07-12 10:49:00.828793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:25.850 BaseBdev4 00:22:25.850 10:49:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:26.107 [2024-07-12 10:49:01.055762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:26.107 [2024-07-12 10:49:01.057089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:26.108 [2024-07-12 10:49:01.057158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:26.108 [2024-07-12 10:49:01.057219] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:26.108 [2024-07-12 10:49:01.057454] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f1c20 00:22:26.108 [2024-07-12 10:49:01.057467] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:26.108 [2024-07-12 10:49:01.057674] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1546260 00:22:26.108 [2024-07-12 10:49:01.057830] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f1c20 00:22:26.108 [2024-07-12 10:49:01.057841] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16f1c20 00:22:26.108 [2024-07-12 10:49:01.057948] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.108 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.366 10:49:01 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:26.366 "name": "raid_bdev1", 00:22:26.366 "uuid": "fb975030-7d86-40cc-bff0-1adfaa074b3b", 00:22:26.366 "strip_size_kb": 0, 00:22:26.366 "state": "online", 00:22:26.366 "raid_level": "raid1", 00:22:26.366 "superblock": true, 00:22:26.366 "num_base_bdevs": 4, 00:22:26.366 "num_base_bdevs_discovered": 4, 00:22:26.366 "num_base_bdevs_operational": 4, 00:22:26.366 "base_bdevs_list": [ 00:22:26.366 { 00:22:26.366 "name": "BaseBdev1", 00:22:26.366 "uuid": "1a38c948-f6a3-5e3f-9a2b-f702418dd2b7", 00:22:26.366 "is_configured": true, 00:22:26.366 "data_offset": 2048, 00:22:26.366 "data_size": 63488 00:22:26.366 }, 00:22:26.366 { 00:22:26.366 "name": "BaseBdev2", 00:22:26.366 "uuid": "7be2c657-301a-5aa3-a559-b956ba53dd5f", 00:22:26.366 "is_configured": true, 00:22:26.366 "data_offset": 2048, 00:22:26.366 "data_size": 63488 00:22:26.366 }, 00:22:26.366 { 00:22:26.366 "name": "BaseBdev3", 00:22:26.366 "uuid": "d7794971-ddae-591d-9366-d257f595e29e", 00:22:26.366 "is_configured": true, 00:22:26.366 "data_offset": 2048, 00:22:26.366 "data_size": 63488 00:22:26.366 }, 00:22:26.366 { 00:22:26.366 "name": "BaseBdev4", 00:22:26.366 "uuid": "db7dedd4-c9bf-5804-8c4f-d350fe18f48b", 00:22:26.366 "is_configured": true, 00:22:26.366 "data_offset": 2048, 00:22:26.366 "data_size": 63488 00:22:26.366 } 00:22:26.366 ] 00:22:26.366 }' 00:22:26.366 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:26.366 10:49:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:26.933 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:26.933 10:49:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:26.933 [2024-07-12 10:49:02.038640] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1545c60 00:22:27.924 10:49:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:28.182 [2024-07-12 10:49:03.129453] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:28.182 [2024-07-12 10:49:03.129512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:28.182 [2024-07-12 10:49:03.129725] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1545c60 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:28.182 
10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:28.182 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.183 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.441 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.441 "name": "raid_bdev1", 00:22:28.441 "uuid": "fb975030-7d86-40cc-bff0-1adfaa074b3b", 00:22:28.441 "strip_size_kb": 0, 00:22:28.441 "state": "online", 00:22:28.441 "raid_level": "raid1", 00:22:28.441 "superblock": true, 00:22:28.441 "num_base_bdevs": 4, 00:22:28.441 "num_base_bdevs_discovered": 3, 00:22:28.441 "num_base_bdevs_operational": 3, 00:22:28.441 "base_bdevs_list": [ 00:22:28.441 { 00:22:28.441 "name": null, 00:22:28.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.441 "is_configured": false, 00:22:28.441 "data_offset": 2048, 00:22:28.441 "data_size": 63488 00:22:28.441 }, 00:22:28.441 { 00:22:28.441 "name": "BaseBdev2", 00:22:28.441 "uuid": "7be2c657-301a-5aa3-a559-b956ba53dd5f", 00:22:28.441 "is_configured": true, 00:22:28.441 "data_offset": 2048, 00:22:28.441 "data_size": 63488 00:22:28.441 }, 00:22:28.441 { 00:22:28.441 "name": "BaseBdev3", 00:22:28.441 "uuid": "d7794971-ddae-591d-9366-d257f595e29e", 00:22:28.441 "is_configured": true, 00:22:28.441 "data_offset": 2048, 00:22:28.441 "data_size": 63488 00:22:28.441 }, 00:22:28.441 { 00:22:28.441 "name": "BaseBdev4", 00:22:28.441 "uuid": "db7dedd4-c9bf-5804-8c4f-d350fe18f48b", 00:22:28.441 "is_configured": true, 00:22:28.441 "data_offset": 2048, 00:22:28.441 "data_size": 63488 00:22:28.441 } 00:22:28.441 ] 00:22:28.441 }' 00:22:28.441 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.441 10:49:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:29.010 10:49:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:29.269 [2024-07-12 10:49:04.208854] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:29.269 [2024-07-12 10:49:04.208885] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:29.269 [2024-07-12 10:49:04.212013] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:29.269 [2024-07-12 10:49:04.212047] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:29.269 [2024-07-12 10:49:04.212144] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:29.269 [2024-07-12 10:49:04.212155] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f1c20 name raid_bdev1, state 
offline 00:22:29.269 0 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2120088 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2120088 ']' 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2120088 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2120088 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2120088' 00:22:29.269 killing process with pid 2120088 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2120088 00:22:29.269 [2024-07-12 10:49:04.276717] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:29.269 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2120088 00:22:29.270 [2024-07-12 10:49:04.308673] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Zr9wQ5i6IB 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:29.529 00:22:29.529 real 0m7.518s 00:22:29.529 user 0m12.004s 00:22:29.529 sys 0m1.332s 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:29.529 10:49:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:29.529 ************************************ 00:22:29.529 END TEST raid_write_error_test 00:22:29.530 ************************************ 00:22:29.530 10:49:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:29.530 10:49:04 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:22:29.530 10:49:04 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:22:29.530 10:49:04 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:22:29.530 10:49:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:29.530 10:49:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:29.530 10:49:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:29.530 ************************************ 00:22:29.530 START TEST raid_rebuild_test 00:22:29.530 ************************************ 00:22:29.530 
10:49:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2121229 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2121229 /var/tmp/spdk-raid.sock 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2121229 ']' 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:29.530 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:29.530 10:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:29.530 [2024-07-12 10:49:04.683723] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:22:29.530 [2024-07-12 10:49:04.683787] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2121229 ] 00:22:29.530 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:29.530 Zero copy mechanism will not be used. 00:22:29.789 [2024-07-12 10:49:04.812229] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:29.789 [2024-07-12 10:49:04.914353] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:29.789 [2024-07-12 10:49:04.975472] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:29.789 [2024-07-12 10:49:04.975520] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.725 10:49:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:30.725 10:49:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:22:30.725 10:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:30.725 10:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:30.725 BaseBdev1_malloc 00:22:30.725 10:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:30.985 [2024-07-12 10:49:06.056389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:30.985 [2024-07-12 10:49:06.056437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:30.985 [2024-07-12 10:49:06.056465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f13d40 00:22:30.985 [2024-07-12 10:49:06.056478] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:30.985 [2024-07-12 10:49:06.058230] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:30.985 [2024-07-12 10:49:06.058261] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:30.985 BaseBdev1 00:22:30.985 10:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:30.985 10:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:31.553 BaseBdev2_malloc 00:22:31.554 10:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 
00:22:31.812 [2024-07-12 10:49:06.803226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:31.812 [2024-07-12 10:49:06.803274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.812 [2024-07-12 10:49:06.803301] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f14860 00:22:31.812 [2024-07-12 10:49:06.803313] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.812 [2024-07-12 10:49:06.804868] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.812 [2024-07-12 10:49:06.804897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:31.812 BaseBdev2 00:22:31.812 10:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:32.072 spare_malloc 00:22:32.072 10:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:32.642 spare_delay 00:22:32.642 10:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:32.642 [2024-07-12 10:49:07.782424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:32.642 [2024-07-12 10:49:07.782472] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.642 [2024-07-12 10:49:07.782501] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c2ec0 00:22:32.642 [2024-07-12 10:49:07.782515] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.642 [2024-07-12 10:49:07.784126] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.642 [2024-07-12 10:49:07.784157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:32.642 spare 00:22:32.642 10:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:32.901 [2024-07-12 10:49:08.011035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:32.901 [2024-07-12 10:49:08.012348] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:32.901 [2024-07-12 10:49:08.012425] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c4070 00:22:32.901 [2024-07-12 10:49:08.012436] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:32.901 [2024-07-12 10:49:08.012651] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20bd490 00:22:32.901 [2024-07-12 10:49:08.012794] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c4070 00:22:32.901 [2024-07-12 10:49:08.012804] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20c4070 00:22:32.901 [2024-07-12 10:49:08.012918] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.901 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.467 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.467 "name": "raid_bdev1", 00:22:33.467 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:33.467 "strip_size_kb": 0, 00:22:33.467 "state": "online", 00:22:33.467 "raid_level": "raid1", 00:22:33.467 "superblock": false, 00:22:33.467 "num_base_bdevs": 2, 00:22:33.467 "num_base_bdevs_discovered": 2, 00:22:33.467 "num_base_bdevs_operational": 2, 00:22:33.467 "base_bdevs_list": [ 00:22:33.467 { 00:22:33.467 "name": "BaseBdev1", 00:22:33.467 "uuid": "8068f7e0-21f7-59da-9d25-e5a6c4b480d3", 00:22:33.467 "is_configured": true, 00:22:33.467 "data_offset": 0, 00:22:33.467 "data_size": 65536 00:22:33.467 }, 00:22:33.467 { 00:22:33.467 "name": "BaseBdev2", 00:22:33.467 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:33.467 "is_configured": true, 00:22:33.467 "data_offset": 0, 00:22:33.467 "data_size": 65536 00:22:33.467 } 00:22:33.467 ] 00:22:33.467 }' 00:22:33.467 10:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.467 10:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:34.244 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:34.244 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:34.244 [2024-07-12 10:49:09.358831] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:34.244 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:34.244 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.244 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.501 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:34.758 [2024-07-12 10:49:09.855958] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20bd490 00:22:34.758 /dev/nbd0 00:22:34.758 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:34.758 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:34.758 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:34.758 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:34.758 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:34.759 1+0 records in 00:22:34.759 1+0 records out 00:22:34.759 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268184 s, 15.3 MB/s 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test 
-- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:34.759 10:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:41.317 65536+0 records in 00:22:41.317 65536+0 records out 00:22:41.317 33554432 bytes (34 MB, 32 MiB) copied, 5.51067 s, 6.1 MB/s 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:41.317 [2024-07-12 10:49:15.695034] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:41.317 [2024-07-12 10:49:15.871786] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.317 
10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.317 10:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.317 10:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.317 "name": "raid_bdev1", 00:22:41.317 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:41.317 "strip_size_kb": 0, 00:22:41.317 "state": "online", 00:22:41.317 "raid_level": "raid1", 00:22:41.317 "superblock": false, 00:22:41.317 "num_base_bdevs": 2, 00:22:41.317 "num_base_bdevs_discovered": 1, 00:22:41.317 "num_base_bdevs_operational": 1, 00:22:41.317 "base_bdevs_list": [ 00:22:41.317 { 00:22:41.317 "name": null, 00:22:41.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.317 "is_configured": false, 00:22:41.317 "data_offset": 0, 00:22:41.317 "data_size": 65536 00:22:41.317 }, 00:22:41.317 { 00:22:41.317 "name": "BaseBdev2", 00:22:41.317 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:41.317 "is_configured": true, 00:22:41.317 "data_offset": 0, 00:22:41.317 "data_size": 65536 00:22:41.317 } 00:22:41.317 ] 00:22:41.317 }' 00:22:41.317 10:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.317 10:49:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:41.576 10:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:41.834 [2024-07-12 10:49:16.874492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:41.834 [2024-07-12 10:49:16.879408] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20bed90 00:22:41.834 [2024-07-12 10:49:16.881666] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:41.834 10:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:42.769 10:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:42.769 10:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:42.769 10:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:42.769 10:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:42.769 10:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:42.769 10:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.769 10:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.028 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:43.028 "name": "raid_bdev1", 00:22:43.028 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:43.028 "strip_size_kb": 0, 00:22:43.028 "state": "online", 00:22:43.028 "raid_level": "raid1", 00:22:43.028 "superblock": false, 00:22:43.028 "num_base_bdevs": 2, 00:22:43.028 "num_base_bdevs_discovered": 2, 00:22:43.028 "num_base_bdevs_operational": 2, 
00:22:43.028 "process": { 00:22:43.028 "type": "rebuild", 00:22:43.028 "target": "spare", 00:22:43.028 "progress": { 00:22:43.028 "blocks": 24576, 00:22:43.028 "percent": 37 00:22:43.028 } 00:22:43.028 }, 00:22:43.028 "base_bdevs_list": [ 00:22:43.028 { 00:22:43.028 "name": "spare", 00:22:43.028 "uuid": "e5369efb-24a0-51d5-9538-5c644e927c96", 00:22:43.028 "is_configured": true, 00:22:43.028 "data_offset": 0, 00:22:43.028 "data_size": 65536 00:22:43.028 }, 00:22:43.028 { 00:22:43.028 "name": "BaseBdev2", 00:22:43.028 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:43.028 "is_configured": true, 00:22:43.028 "data_offset": 0, 00:22:43.028 "data_size": 65536 00:22:43.028 } 00:22:43.028 ] 00:22:43.028 }' 00:22:43.028 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.028 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:43.028 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.286 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:43.286 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:43.544 [2024-07-12 10:49:18.721586] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:43.803 [2024-07-12 10:49:18.796578] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:43.803 [2024-07-12 10:49:18.796630] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:43.803 [2024-07-12 10:49:18.796646] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:43.803 [2024-07-12 10:49:18.796654] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.803 10:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.061 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.061 "name": "raid_bdev1", 00:22:44.061 "uuid": 
"a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:44.061 "strip_size_kb": 0, 00:22:44.061 "state": "online", 00:22:44.061 "raid_level": "raid1", 00:22:44.061 "superblock": false, 00:22:44.061 "num_base_bdevs": 2, 00:22:44.061 "num_base_bdevs_discovered": 1, 00:22:44.061 "num_base_bdevs_operational": 1, 00:22:44.061 "base_bdevs_list": [ 00:22:44.061 { 00:22:44.061 "name": null, 00:22:44.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.061 "is_configured": false, 00:22:44.061 "data_offset": 0, 00:22:44.061 "data_size": 65536 00:22:44.061 }, 00:22:44.061 { 00:22:44.061 "name": "BaseBdev2", 00:22:44.061 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:44.061 "is_configured": true, 00:22:44.061 "data_offset": 0, 00:22:44.061 "data_size": 65536 00:22:44.061 } 00:22:44.061 ] 00:22:44.061 }' 00:22:44.061 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.061 10:49:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:44.626 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:44.626 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:44.626 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:44.626 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:44.626 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:44.626 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.626 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.885 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:44.885 "name": "raid_bdev1", 00:22:44.885 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:44.885 "strip_size_kb": 0, 00:22:44.885 "state": "online", 00:22:44.885 "raid_level": "raid1", 00:22:44.885 "superblock": false, 00:22:44.885 "num_base_bdevs": 2, 00:22:44.885 "num_base_bdevs_discovered": 1, 00:22:44.885 "num_base_bdevs_operational": 1, 00:22:44.885 "base_bdevs_list": [ 00:22:44.885 { 00:22:44.885 "name": null, 00:22:44.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.885 "is_configured": false, 00:22:44.885 "data_offset": 0, 00:22:44.885 "data_size": 65536 00:22:44.885 }, 00:22:44.885 { 00:22:44.885 "name": "BaseBdev2", 00:22:44.885 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:44.885 "is_configured": true, 00:22:44.885 "data_offset": 0, 00:22:44.885 "data_size": 65536 00:22:44.885 } 00:22:44.885 ] 00:22:44.885 }' 00:22:44.885 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:44.885 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:44.886 10:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:44.886 10:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:44.886 10:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:45.144 [2024-07-12 10:49:20.252809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev spare is claimed 00:22:45.144 [2024-07-12 10:49:20.258533] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f136a0 00:22:45.144 [2024-07-12 10:49:20.260056] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:45.144 10:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.587 "name": "raid_bdev1", 00:22:46.587 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:46.587 "strip_size_kb": 0, 00:22:46.587 "state": "online", 00:22:46.587 "raid_level": "raid1", 00:22:46.587 "superblock": false, 00:22:46.587 "num_base_bdevs": 2, 00:22:46.587 "num_base_bdevs_discovered": 2, 00:22:46.587 "num_base_bdevs_operational": 2, 00:22:46.587 "process": { 00:22:46.587 "type": "rebuild", 00:22:46.587 "target": "spare", 00:22:46.587 "progress": { 00:22:46.587 "blocks": 24576, 00:22:46.587 "percent": 37 00:22:46.587 } 00:22:46.587 }, 00:22:46.587 "base_bdevs_list": [ 00:22:46.587 { 00:22:46.587 "name": "spare", 00:22:46.587 "uuid": "e5369efb-24a0-51d5-9538-5c644e927c96", 00:22:46.587 "is_configured": true, 00:22:46.587 "data_offset": 0, 00:22:46.587 "data_size": 65536 00:22:46.587 }, 00:22:46.587 { 00:22:46.587 "name": "BaseBdev2", 00:22:46.587 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:46.587 "is_configured": true, 00:22:46.587 "data_offset": 0, 00:22:46.587 "data_size": 65536 00:22:46.587 } 00:22:46.587 ] 00:22:46.587 }' 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=757 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.587 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.852 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.852 "name": "raid_bdev1", 00:22:46.852 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:46.852 "strip_size_kb": 0, 00:22:46.852 "state": "online", 00:22:46.852 "raid_level": "raid1", 00:22:46.852 "superblock": false, 00:22:46.852 "num_base_bdevs": 2, 00:22:46.852 "num_base_bdevs_discovered": 2, 00:22:46.852 "num_base_bdevs_operational": 2, 00:22:46.852 "process": { 00:22:46.852 "type": "rebuild", 00:22:46.852 "target": "spare", 00:22:46.852 "progress": { 00:22:46.852 "blocks": 30720, 00:22:46.852 "percent": 46 00:22:46.852 } 00:22:46.852 }, 00:22:46.852 "base_bdevs_list": [ 00:22:46.852 { 00:22:46.852 "name": "spare", 00:22:46.852 "uuid": "e5369efb-24a0-51d5-9538-5c644e927c96", 00:22:46.852 "is_configured": true, 00:22:46.852 "data_offset": 0, 00:22:46.852 "data_size": 65536 00:22:46.852 }, 00:22:46.852 { 00:22:46.852 "name": "BaseBdev2", 00:22:46.852 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:46.852 "is_configured": true, 00:22:46.852 "data_offset": 0, 00:22:46.852 "data_size": 65536 00:22:46.852 } 00:22:46.852 ] 00:22:46.852 }' 00:22:46.852 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.852 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:46.852 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.852 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:46.852 10:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.785 10:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.044 10:49:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:22:48.044 "name": "raid_bdev1", 00:22:48.044 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:48.044 "strip_size_kb": 0, 00:22:48.044 "state": "online", 00:22:48.044 "raid_level": "raid1", 00:22:48.044 "superblock": false, 00:22:48.044 "num_base_bdevs": 2, 00:22:48.044 "num_base_bdevs_discovered": 2, 00:22:48.044 "num_base_bdevs_operational": 2, 00:22:48.044 "process": { 00:22:48.044 "type": "rebuild", 00:22:48.044 "target": "spare", 00:22:48.044 "progress": { 00:22:48.044 "blocks": 57344, 00:22:48.044 "percent": 87 00:22:48.044 } 00:22:48.044 }, 00:22:48.044 "base_bdevs_list": [ 00:22:48.044 { 00:22:48.044 "name": "spare", 00:22:48.044 "uuid": "e5369efb-24a0-51d5-9538-5c644e927c96", 00:22:48.044 "is_configured": true, 00:22:48.044 "data_offset": 0, 00:22:48.044 "data_size": 65536 00:22:48.044 }, 00:22:48.044 { 00:22:48.044 "name": "BaseBdev2", 00:22:48.044 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:48.044 "is_configured": true, 00:22:48.044 "data_offset": 0, 00:22:48.044 "data_size": 65536 00:22:48.044 } 00:22:48.044 ] 00:22:48.044 }' 00:22:48.044 10:49:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:48.302 10:49:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:48.302 10:49:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:48.302 10:49:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:48.302 10:49:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:48.302 [2024-07-12 10:49:23.484719] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:48.302 [2024-07-12 10:49:23.484776] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:48.302 [2024-07-12 10:49:23.484813] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.234 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.492 "name": "raid_bdev1", 00:22:49.492 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:49.492 "strip_size_kb": 0, 00:22:49.492 "state": "online", 00:22:49.492 "raid_level": "raid1", 00:22:49.492 "superblock": false, 00:22:49.492 "num_base_bdevs": 2, 00:22:49.492 "num_base_bdevs_discovered": 2, 00:22:49.492 "num_base_bdevs_operational": 2, 00:22:49.492 "base_bdevs_list": [ 00:22:49.492 { 00:22:49.492 "name": "spare", 00:22:49.492 "uuid": 
"e5369efb-24a0-51d5-9538-5c644e927c96", 00:22:49.492 "is_configured": true, 00:22:49.492 "data_offset": 0, 00:22:49.492 "data_size": 65536 00:22:49.492 }, 00:22:49.492 { 00:22:49.492 "name": "BaseBdev2", 00:22:49.492 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:49.492 "is_configured": true, 00:22:49.492 "data_offset": 0, 00:22:49.492 "data_size": 65536 00:22:49.492 } 00:22:49.492 ] 00:22:49.492 }' 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.492 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.750 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.750 "name": "raid_bdev1", 00:22:49.750 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:49.750 "strip_size_kb": 0, 00:22:49.750 "state": "online", 00:22:49.750 "raid_level": "raid1", 00:22:49.750 "superblock": false, 00:22:49.750 "num_base_bdevs": 2, 00:22:49.750 "num_base_bdevs_discovered": 2, 00:22:49.750 "num_base_bdevs_operational": 2, 00:22:49.750 "base_bdevs_list": [ 00:22:49.750 { 00:22:49.750 "name": "spare", 00:22:49.750 "uuid": "e5369efb-24a0-51d5-9538-5c644e927c96", 00:22:49.750 "is_configured": true, 00:22:49.750 "data_offset": 0, 00:22:49.750 "data_size": 65536 00:22:49.750 }, 00:22:49.750 { 00:22:49.750 "name": "BaseBdev2", 00:22:49.750 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:49.750 "is_configured": true, 00:22:49.750 "data_offset": 0, 00:22:49.750 "data_size": 65536 00:22:49.750 } 00:22:49.750 ] 00:22:49.750 }' 00:22:49.750 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:49.751 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:49.751 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:50.010 
10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.010 10:49:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.268 10:49:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.268 "name": "raid_bdev1", 00:22:50.268 "uuid": "a4ef62d0-97b2-4d2b-9545-471937b78534", 00:22:50.268 "strip_size_kb": 0, 00:22:50.268 "state": "online", 00:22:50.268 "raid_level": "raid1", 00:22:50.268 "superblock": false, 00:22:50.268 "num_base_bdevs": 2, 00:22:50.268 "num_base_bdevs_discovered": 2, 00:22:50.268 "num_base_bdevs_operational": 2, 00:22:50.268 "base_bdevs_list": [ 00:22:50.268 { 00:22:50.268 "name": "spare", 00:22:50.268 "uuid": "e5369efb-24a0-51d5-9538-5c644e927c96", 00:22:50.268 "is_configured": true, 00:22:50.268 "data_offset": 0, 00:22:50.268 "data_size": 65536 00:22:50.268 }, 00:22:50.268 { 00:22:50.268 "name": "BaseBdev2", 00:22:50.268 "uuid": "e00d0bf9-db60-56d1-9063-c2a404c0857a", 00:22:50.268 "is_configured": true, 00:22:50.268 "data_offset": 0, 00:22:50.268 "data_size": 65536 00:22:50.268 } 00:22:50.268 ] 00:22:50.268 }' 00:22:50.268 10:49:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.268 10:49:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.834 10:49:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:51.093 [2024-07-12 10:49:26.040802] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:51.093 [2024-07-12 10:49:26.040830] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:51.093 [2024-07-12 10:49:26.040884] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:51.093 [2024-07-12 10:49:26.040938] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:51.093 [2024-07-12 10:49:26.040951] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c4070 name raid_bdev1, state offline 00:22:51.093 10:49:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.093 10:49:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:51.352 10:49:26 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.352 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:51.610 /dev/nbd0 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:51.610 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:51.610 1+0 records in 00:22:51.610 1+0 records out 00:22:51.610 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026358 s, 15.5 MB/s 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:51.611 /dev/nbd1 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:51.611 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:51.869 1+0 records in 00:22:51.869 1+0 records out 00:22:51.869 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304358 s, 13.5 MB/s 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:51.869 10:49:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # 
local nbd_name=nbd0 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:52.127 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2121229 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2121229 ']' 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2121229 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2121229 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2121229' 00:22:52.386 killing process with pid 2121229 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2121229 00:22:52.386 Received shutdown signal, test time was about 60.000000 seconds 00:22:52.386 00:22:52.386 Latency(us) 00:22:52.386 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:52.386 =================================================================================================================== 00:22:52.386 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:52.386 [2024-07-12 10:49:27.501539] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:52.386 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2121229 00:22:52.386 [2024-07-12 10:49:27.529597] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:52.644 10:49:27 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:22:52.644 00:22:52.644 real 0m23.130s 00:22:52.644 user 0m31.373s 00:22:52.644 sys 0m5.182s 00:22:52.644 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:52.644 10:49:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:52.644 ************************************ 00:22:52.644 END TEST raid_rebuild_test 00:22:52.644 ************************************ 00:22:52.644 10:49:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:52.644 10:49:27 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:22:52.644 10:49:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:52.644 10:49:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:52.644 10:49:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:52.902 ************************************ 00:22:52.902 START TEST raid_rebuild_test_sb 00:22:52.902 ************************************ 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:52.902 10:49:27 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2124448 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2124448 /var/tmp/spdk-raid.sock 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2124448 ']' 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:52.902 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:52.902 10:49:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:52.902 [2024-07-12 10:49:27.914052] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:22:52.902 [2024-07-12 10:49:27.914127] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2124448 ] 00:22:52.902 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:52.902 Zero copy mechanism will not be used. 
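The entries above show the bring-up pattern used for every test in this suite: bdevperf is started with its own RPC socket (-r /var/tmp/spdk-raid.sock) and the harness blocks in waitforlisten until that socket answers before any RAID RPCs are issued; the zero-copy notice simply reflects the 3 MiB I/O size (-o 3M) exceeding the 65536-byte threshold. A minimal sketch of that launch-and-wait pattern, assuming paths relative to the SPDK tree and using the generic rpc_get_methods call as the readiness probe (the real waitforlisten helper adds retries, timeouts and cleanup):

# sketch only: start bdevperf on a private RPC socket, then poll until it is ready
rpc_sock=/var/tmp/spdk-raid.sock
./build/examples/bdevperf -r "$rpc_sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
    -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
until ./scripts/rpc.py -s "$rpc_sock" rpc_get_methods &> /dev/null; do
    kill -0 "$raid_pid" || { echo "bdevperf exited before listening" >&2; exit 1; }
    sleep 0.5
done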
00:22:52.902 [2024-07-12 10:49:28.055937] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:53.160 [2024-07-12 10:49:28.171798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:53.160 [2024-07-12 10:49:28.240036] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:53.160 [2024-07-12 10:49:28.240060] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:53.727 10:49:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:53.727 10:49:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:53.727 10:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:53.727 10:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:53.985 BaseBdev1_malloc 00:22:53.985 10:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:54.244 [2024-07-12 10:49:29.377489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:54.244 [2024-07-12 10:49:29.377540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.244 [2024-07-12 10:49:29.377566] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17fcd40 00:22:54.244 [2024-07-12 10:49:29.377579] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.244 [2024-07-12 10:49:29.379227] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.244 [2024-07-12 10:49:29.379259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:54.244 BaseBdev1 00:22:54.244 10:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:54.244 10:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:54.502 BaseBdev2_malloc 00:22:54.502 10:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:54.760 [2024-07-12 10:49:29.871731] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:54.760 [2024-07-12 10:49:29.871778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.760 [2024-07-12 10:49:29.871804] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17fd860 00:22:54.760 [2024-07-12 10:49:29.871817] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.760 [2024-07-12 10:49:29.873182] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.760 [2024-07-12 10:49:29.873209] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:54.760 BaseBdev2 00:22:54.760 10:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
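The RPC trace around this point builds the fixture for the superblock variant entirely through rpc.py: each base device is a 32 MiB malloc bdev with 512-byte blocks wrapped in a passthru bdev, and the spare additionally sits behind a delay bdev so the rebuild runs slowly enough to be observed. Condensed, the sequence the surrounding xtrace expands to looks roughly like this (socket path and arguments taken from the log, the loop is an editorial simplification):

# sketch of the fixture setup replayed from the surrounding xtrace output
rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for b in BaseBdev1 BaseBdev2; do
    $rpc bdev_malloc_create 32 512 -b "${b}_malloc"        # 32 MiB backing store, 512 B blocks
    $rpc bdev_passthru_create -b "${b}_malloc" -p "$b"     # claimable wrapper used by the raid bdev
done
$rpc bdev_malloc_create 32 512 -b spare_malloc
$rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000   # delay writes only
$rpc bdev_passthru_create -b spare_delay -p spare
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1              # -s: with superblock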
00:22:55.018 spare_malloc 00:22:55.018 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:55.276 spare_delay 00:22:55.276 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:55.533 [2024-07-12 10:49:30.606376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:55.533 [2024-07-12 10:49:30.606433] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.533 [2024-07-12 10:49:30.606456] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19abec0 00:22:55.533 [2024-07-12 10:49:30.606469] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.533 [2024-07-12 10:49:30.607974] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.534 [2024-07-12 10:49:30.608006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:55.534 spare 00:22:55.534 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:55.791 [2024-07-12 10:49:30.839017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:55.791 [2024-07-12 10:49:30.840262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:55.791 [2024-07-12 10:49:30.840427] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19ad070 00:22:55.791 [2024-07-12 10:49:30.840440] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:55.791 [2024-07-12 10:49:30.840640] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a6490 00:22:55.791 [2024-07-12 10:49:30.840781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19ad070 00:22:55.792 [2024-07-12 10:49:30.840792] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19ad070 00:22:55.792 [2024-07-12 10:49:30.840892] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.792 10:49:30 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.792 10:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.049 10:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.049 "name": "raid_bdev1", 00:22:56.049 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:22:56.049 "strip_size_kb": 0, 00:22:56.049 "state": "online", 00:22:56.049 "raid_level": "raid1", 00:22:56.049 "superblock": true, 00:22:56.049 "num_base_bdevs": 2, 00:22:56.049 "num_base_bdevs_discovered": 2, 00:22:56.049 "num_base_bdevs_operational": 2, 00:22:56.049 "base_bdevs_list": [ 00:22:56.049 { 00:22:56.049 "name": "BaseBdev1", 00:22:56.049 "uuid": "7bb00407-1a7f-551e-b61c-3dcdc57cda8e", 00:22:56.049 "is_configured": true, 00:22:56.049 "data_offset": 2048, 00:22:56.049 "data_size": 63488 00:22:56.049 }, 00:22:56.049 { 00:22:56.049 "name": "BaseBdev2", 00:22:56.049 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:22:56.049 "is_configured": true, 00:22:56.049 "data_offset": 2048, 00:22:56.049 "data_size": 63488 00:22:56.049 } 00:22:56.049 ] 00:22:56.049 }' 00:22:56.049 10:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.049 10:49:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:56.614 10:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:56.614 10:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:56.872 [2024-07-12 10:49:31.926113] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:56.872 10:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:56.872 10:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.872 10:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:57.130 10:49:32 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:57.130 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:57.388 [2024-07-12 10:49:32.423227] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a6490 00:22:57.388 /dev/nbd0 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:57.388 1+0 records in 00:22:57.388 1+0 records out 00:22:57.388 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234474 s, 17.5 MB/s 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:57.388 10:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:02.657 63488+0 records in 00:23:02.657 63488+0 records out 00:23:02.657 32505856 bytes (33 MB, 31 MiB) copied, 4.56317 s, 7.1 MB/s 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:02.657 [2024-07-12 10:49:37.318635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:02.657 [2024-07-12 10:49:37.550662] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.657 "name": "raid_bdev1", 00:23:02.657 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:02.657 "strip_size_kb": 0, 00:23:02.657 "state": "online", 
00:23:02.657 "raid_level": "raid1", 00:23:02.657 "superblock": true, 00:23:02.657 "num_base_bdevs": 2, 00:23:02.657 "num_base_bdevs_discovered": 1, 00:23:02.657 "num_base_bdevs_operational": 1, 00:23:02.657 "base_bdevs_list": [ 00:23:02.657 { 00:23:02.657 "name": null, 00:23:02.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.657 "is_configured": false, 00:23:02.657 "data_offset": 2048, 00:23:02.657 "data_size": 63488 00:23:02.657 }, 00:23:02.657 { 00:23:02.657 "name": "BaseBdev2", 00:23:02.657 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:02.657 "is_configured": true, 00:23:02.657 "data_offset": 2048, 00:23:02.657 "data_size": 63488 00:23:02.657 } 00:23:02.657 ] 00:23:02.657 }' 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.657 10:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:03.226 10:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:03.485 [2024-07-12 10:49:38.585408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:03.485 [2024-07-12 10:49:38.590430] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19acce0 00:23:03.485 [2024-07-12 10:49:38.592690] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:03.485 10:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:04.461 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:04.462 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:04.462 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:04.462 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:04.462 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:04.462 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.462 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.721 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:04.721 "name": "raid_bdev1", 00:23:04.721 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:04.721 "strip_size_kb": 0, 00:23:04.721 "state": "online", 00:23:04.721 "raid_level": "raid1", 00:23:04.721 "superblock": true, 00:23:04.721 "num_base_bdevs": 2, 00:23:04.721 "num_base_bdevs_discovered": 2, 00:23:04.721 "num_base_bdevs_operational": 2, 00:23:04.721 "process": { 00:23:04.721 "type": "rebuild", 00:23:04.721 "target": "spare", 00:23:04.721 "progress": { 00:23:04.721 "blocks": 24576, 00:23:04.721 "percent": 38 00:23:04.721 } 00:23:04.721 }, 00:23:04.721 "base_bdevs_list": [ 00:23:04.721 { 00:23:04.721 "name": "spare", 00:23:04.721 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:04.721 "is_configured": true, 00:23:04.721 "data_offset": 2048, 00:23:04.721 "data_size": 63488 00:23:04.721 }, 00:23:04.721 { 00:23:04.721 "name": "BaseBdev2", 00:23:04.721 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:04.721 "is_configured": true, 00:23:04.721 
"data_offset": 2048, 00:23:04.721 "data_size": 63488 00:23:04.721 } 00:23:04.721 ] 00:23:04.721 }' 00:23:04.721 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:04.721 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:04.721 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:04.980 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:04.981 10:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:05.240 [2024-07-12 10:49:40.186605] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:05.240 [2024-07-12 10:49:40.205464] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:05.240 [2024-07-12 10:49:40.205514] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:05.240 [2024-07-12 10:49:40.205530] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:05.240 [2024-07-12 10:49:40.205538] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.240 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.499 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.499 "name": "raid_bdev1", 00:23:05.499 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:05.499 "strip_size_kb": 0, 00:23:05.499 "state": "online", 00:23:05.499 "raid_level": "raid1", 00:23:05.499 "superblock": true, 00:23:05.499 "num_base_bdevs": 2, 00:23:05.499 "num_base_bdevs_discovered": 1, 00:23:05.499 "num_base_bdevs_operational": 1, 00:23:05.499 "base_bdevs_list": [ 00:23:05.499 { 00:23:05.499 "name": null, 00:23:05.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.499 "is_configured": false, 00:23:05.499 "data_offset": 2048, 00:23:05.499 "data_size": 63488 00:23:05.499 }, 00:23:05.499 { 
00:23:05.499 "name": "BaseBdev2", 00:23:05.499 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:05.499 "is_configured": true, 00:23:05.499 "data_offset": 2048, 00:23:05.499 "data_size": 63488 00:23:05.499 } 00:23:05.499 ] 00:23:05.499 }' 00:23:05.499 10:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.499 10:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:06.068 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:06.068 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:06.068 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:06.068 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:06.068 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:06.068 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.068 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.328 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:06.328 "name": "raid_bdev1", 00:23:06.328 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:06.328 "strip_size_kb": 0, 00:23:06.328 "state": "online", 00:23:06.328 "raid_level": "raid1", 00:23:06.328 "superblock": true, 00:23:06.328 "num_base_bdevs": 2, 00:23:06.328 "num_base_bdevs_discovered": 1, 00:23:06.328 "num_base_bdevs_operational": 1, 00:23:06.328 "base_bdevs_list": [ 00:23:06.328 { 00:23:06.328 "name": null, 00:23:06.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.328 "is_configured": false, 00:23:06.328 "data_offset": 2048, 00:23:06.328 "data_size": 63488 00:23:06.328 }, 00:23:06.328 { 00:23:06.328 "name": "BaseBdev2", 00:23:06.328 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:06.328 "is_configured": true, 00:23:06.328 "data_offset": 2048, 00:23:06.328 "data_size": 63488 00:23:06.328 } 00:23:06.328 ] 00:23:06.328 }' 00:23:06.328 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:06.328 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:06.328 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:06.328 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:06.328 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:06.587 [2024-07-12 10:49:41.629840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:06.587 [2024-07-12 10:49:41.635512] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19acce0 00:23:06.587 [2024-07-12 10:49:41.637063] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:06.587 10:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:07.525 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:23:07.525 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:07.525 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:07.525 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:07.525 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:07.525 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.525 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:07.785 "name": "raid_bdev1", 00:23:07.785 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:07.785 "strip_size_kb": 0, 00:23:07.785 "state": "online", 00:23:07.785 "raid_level": "raid1", 00:23:07.785 "superblock": true, 00:23:07.785 "num_base_bdevs": 2, 00:23:07.785 "num_base_bdevs_discovered": 2, 00:23:07.785 "num_base_bdevs_operational": 2, 00:23:07.785 "process": { 00:23:07.785 "type": "rebuild", 00:23:07.785 "target": "spare", 00:23:07.785 "progress": { 00:23:07.785 "blocks": 22528, 00:23:07.785 "percent": 35 00:23:07.785 } 00:23:07.785 }, 00:23:07.785 "base_bdevs_list": [ 00:23:07.785 { 00:23:07.785 "name": "spare", 00:23:07.785 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:07.785 "is_configured": true, 00:23:07.785 "data_offset": 2048, 00:23:07.785 "data_size": 63488 00:23:07.785 }, 00:23:07.785 { 00:23:07.785 "name": "BaseBdev2", 00:23:07.785 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:07.785 "is_configured": true, 00:23:07.785 "data_offset": 2048, 00:23:07.785 "data_size": 63488 00:23:07.785 } 00:23:07.785 ] 00:23:07.785 }' 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:07.785 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=778 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.785 10:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.043 10:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:08.043 "name": "raid_bdev1", 00:23:08.043 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:08.043 "strip_size_kb": 0, 00:23:08.043 "state": "online", 00:23:08.043 "raid_level": "raid1", 00:23:08.043 "superblock": true, 00:23:08.043 "num_base_bdevs": 2, 00:23:08.043 "num_base_bdevs_discovered": 2, 00:23:08.043 "num_base_bdevs_operational": 2, 00:23:08.043 "process": { 00:23:08.043 "type": "rebuild", 00:23:08.043 "target": "spare", 00:23:08.043 "progress": { 00:23:08.043 "blocks": 30720, 00:23:08.043 "percent": 48 00:23:08.043 } 00:23:08.043 }, 00:23:08.043 "base_bdevs_list": [ 00:23:08.043 { 00:23:08.043 "name": "spare", 00:23:08.043 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:08.043 "is_configured": true, 00:23:08.043 "data_offset": 2048, 00:23:08.043 "data_size": 63488 00:23:08.043 }, 00:23:08.043 { 00:23:08.043 "name": "BaseBdev2", 00:23:08.043 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:08.043 "is_configured": true, 00:23:08.043 "data_offset": 2048, 00:23:08.043 "data_size": 63488 00:23:08.043 } 00:23:08.043 ] 00:23:08.043 }' 00:23:08.043 10:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:08.043 10:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:08.043 10:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:08.300 10:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:08.300 10:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.234 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.492 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:09.492 "name": "raid_bdev1", 00:23:09.492 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:09.492 "strip_size_kb": 0, 00:23:09.492 "state": 
"online", 00:23:09.492 "raid_level": "raid1", 00:23:09.492 "superblock": true, 00:23:09.492 "num_base_bdevs": 2, 00:23:09.492 "num_base_bdevs_discovered": 2, 00:23:09.492 "num_base_bdevs_operational": 2, 00:23:09.492 "process": { 00:23:09.492 "type": "rebuild", 00:23:09.492 "target": "spare", 00:23:09.492 "progress": { 00:23:09.492 "blocks": 55296, 00:23:09.492 "percent": 87 00:23:09.492 } 00:23:09.492 }, 00:23:09.492 "base_bdevs_list": [ 00:23:09.492 { 00:23:09.492 "name": "spare", 00:23:09.492 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:09.492 "is_configured": true, 00:23:09.492 "data_offset": 2048, 00:23:09.492 "data_size": 63488 00:23:09.492 }, 00:23:09.492 { 00:23:09.492 "name": "BaseBdev2", 00:23:09.492 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:09.492 "is_configured": true, 00:23:09.492 "data_offset": 2048, 00:23:09.492 "data_size": 63488 00:23:09.492 } 00:23:09.492 ] 00:23:09.492 }' 00:23:09.492 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:09.492 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:09.492 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:09.492 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:09.492 10:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:09.750 [2024-07-12 10:49:44.761293] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:09.750 [2024-07-12 10:49:44.761355] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:09.750 [2024-07-12 10:49:44.761440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:10.684 "name": "raid_bdev1", 00:23:10.684 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:10.684 "strip_size_kb": 0, 00:23:10.684 "state": "online", 00:23:10.684 "raid_level": "raid1", 00:23:10.684 "superblock": true, 00:23:10.684 "num_base_bdevs": 2, 00:23:10.684 "num_base_bdevs_discovered": 2, 00:23:10.684 "num_base_bdevs_operational": 2, 00:23:10.684 "base_bdevs_list": [ 00:23:10.684 { 00:23:10.684 "name": "spare", 00:23:10.684 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:10.684 "is_configured": true, 00:23:10.684 "data_offset": 2048, 00:23:10.684 "data_size": 63488 
00:23:10.684 }, 00:23:10.684 { 00:23:10.684 "name": "BaseBdev2", 00:23:10.684 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:10.684 "is_configured": true, 00:23:10.684 "data_offset": 2048, 00:23:10.684 "data_size": 63488 00:23:10.684 } 00:23:10.684 ] 00:23:10.684 }' 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:10.684 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.941 10:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:11.199 "name": "raid_bdev1", 00:23:11.199 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:11.199 "strip_size_kb": 0, 00:23:11.199 "state": "online", 00:23:11.199 "raid_level": "raid1", 00:23:11.199 "superblock": true, 00:23:11.199 "num_base_bdevs": 2, 00:23:11.199 "num_base_bdevs_discovered": 2, 00:23:11.199 "num_base_bdevs_operational": 2, 00:23:11.199 "base_bdevs_list": [ 00:23:11.199 { 00:23:11.199 "name": "spare", 00:23:11.199 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:11.199 "is_configured": true, 00:23:11.199 "data_offset": 2048, 00:23:11.199 "data_size": 63488 00:23:11.199 }, 00:23:11.199 { 00:23:11.199 "name": "BaseBdev2", 00:23:11.199 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:11.199 "is_configured": true, 00:23:11.199 "data_offset": 2048, 00:23:11.199 "data_size": 63488 00:23:11.199 } 00:23:11.199 ] 00:23:11.199 }' 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.199 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.456 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.456 "name": "raid_bdev1", 00:23:11.456 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:11.456 "strip_size_kb": 0, 00:23:11.456 "state": "online", 00:23:11.456 "raid_level": "raid1", 00:23:11.456 "superblock": true, 00:23:11.456 "num_base_bdevs": 2, 00:23:11.456 "num_base_bdevs_discovered": 2, 00:23:11.456 "num_base_bdevs_operational": 2, 00:23:11.456 "base_bdevs_list": [ 00:23:11.457 { 00:23:11.457 "name": "spare", 00:23:11.457 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:11.457 "is_configured": true, 00:23:11.457 "data_offset": 2048, 00:23:11.457 "data_size": 63488 00:23:11.457 }, 00:23:11.457 { 00:23:11.457 "name": "BaseBdev2", 00:23:11.457 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:11.457 "is_configured": true, 00:23:11.457 "data_offset": 2048, 00:23:11.457 "data_size": 63488 00:23:11.457 } 00:23:11.457 ] 00:23:11.457 }' 00:23:11.457 10:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.457 10:49:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:12.021 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:12.279 [2024-07-12 10:49:47.216994] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:12.279 [2024-07-12 10:49:47.217024] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:12.279 [2024-07-12 10:49:47.217085] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:12.279 [2024-07-12 10:49:47.217140] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:12.279 [2024-07-12 10:49:47.217151] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19ad070 name raid_bdev1, state offline 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:12.279 10:49:47 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:12.279 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:12.537 /dev/nbd0 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:12.537 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:12.538 1+0 records in 00:23:12.538 1+0 records out 00:23:12.538 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000150189 s, 27.3 MB/s 00:23:12.538 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.538 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:12.538 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.538 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:12.538 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:12.538 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:12.538 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:12.538 10:49:47 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:12.795 /dev/nbd1 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:12.795 1+0 records in 00:23:12.795 1+0 records out 00:23:12.795 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319184 s, 12.8 MB/s 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:12.795 10:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:13.359 10:49:48 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:13.359 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:13.616 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:13.874 [2024-07-12 10:49:48.852650] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:13.874 [2024-07-12 10:49:48.852694] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.874 [2024-07-12 10:49:48.852721] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a9ad0 00:23:13.874 [2024-07-12 10:49:48.852734] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:13.874 [2024-07-12 10:49:48.854367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.874 [2024-07-12 10:49:48.854398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:13.874 [2024-07-12 10:49:48.854501] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:13.874 [2024-07-12 10:49:48.854528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:13.874 [2024-07-12 10:49:48.854638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:13.874 spare 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
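Note: the data check above (bdev_raid.sh@736-738) exports the rebuilt base bdev and the spare over NBD and compares them past the superblock. A hand-run sketch of the same sequence, assuming the same socket, bdev names and the 1 MiB offset seen here (data_offset 2048 blocks x 512 B = 1048576 bytes):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Expose both members as kernel block devices.
    $rpc -s $sock nbd_start_disk BaseBdev1 /dev/nbd0
    $rpc -s $sock nbd_start_disk spare /dev/nbd1

    # Skip the superblock region on both devices and compare the data area.
    cmp -i 1048576 /dev/nbd0 /dev/nbd1 && echo "mirrors match"

    # Tear the NBD devices down again.
    $rpc -s $sock nbd_stop_disk /dev/nbd0
    $rpc -s $sock nbd_stop_disk /dev/nbd1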
00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.874 10:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.874 [2024-07-12 10:49:48.954951] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19ad2f0 00:23:13.874 [2024-07-12 10:49:48.954968] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:13.874 [2024-07-12 10:49:48.955166] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17fc680 00:23:13.874 [2024-07-12 10:49:48.955311] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19ad2f0 00:23:13.874 [2024-07-12 10:49:48.955322] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19ad2f0 00:23:13.874 [2024-07-12 10:49:48.955426] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:13.874 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.874 "name": "raid_bdev1", 00:23:13.874 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:13.874 "strip_size_kb": 0, 00:23:13.874 "state": "online", 00:23:13.874 "raid_level": "raid1", 00:23:13.874 "superblock": true, 00:23:13.874 "num_base_bdevs": 2, 00:23:13.874 "num_base_bdevs_discovered": 2, 00:23:13.874 "num_base_bdevs_operational": 2, 00:23:13.874 "base_bdevs_list": [ 00:23:13.874 { 00:23:13.874 "name": "spare", 00:23:13.874 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:13.874 "is_configured": true, 00:23:13.874 "data_offset": 2048, 00:23:13.874 "data_size": 63488 00:23:13.874 }, 00:23:13.874 { 00:23:13.874 "name": "BaseBdev2", 00:23:13.874 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:13.874 "is_configured": true, 00:23:13.874 "data_offset": 2048, 00:23:13.874 "data_size": 63488 00:23:13.874 } 00:23:13.874 ] 00:23:13.874 }' 00:23:13.874 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.874 10:49:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.806 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.806 "name": "raid_bdev1", 00:23:14.806 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:14.806 "strip_size_kb": 0, 00:23:14.806 "state": "online", 00:23:14.806 "raid_level": "raid1", 00:23:14.806 "superblock": true, 00:23:14.806 "num_base_bdevs": 2, 00:23:14.806 "num_base_bdevs_discovered": 2, 00:23:14.806 "num_base_bdevs_operational": 2, 00:23:14.806 "base_bdevs_list": [ 00:23:14.806 { 00:23:14.806 "name": "spare", 00:23:14.806 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:14.806 "is_configured": true, 00:23:14.806 "data_offset": 2048, 00:23:14.806 "data_size": 63488 00:23:14.806 }, 00:23:14.806 { 00:23:14.806 "name": "BaseBdev2", 00:23:14.806 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:14.806 "is_configured": true, 00:23:14.806 "data_offset": 2048, 00:23:14.806 "data_size": 63488 00:23:14.806 } 00:23:14.806 ] 00:23:14.806 }' 00:23:14.807 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.807 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:14.807 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.807 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:14.807 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.807 10:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:15.371 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:15.371 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:15.629 [2024-07-12 10:49:50.725752] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.629 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.888 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.888 "name": "raid_bdev1", 00:23:15.888 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:15.888 "strip_size_kb": 0, 00:23:15.888 "state": "online", 00:23:15.888 "raid_level": "raid1", 00:23:15.888 "superblock": true, 00:23:15.888 "num_base_bdevs": 2, 00:23:15.888 "num_base_bdevs_discovered": 1, 00:23:15.888 "num_base_bdevs_operational": 1, 00:23:15.888 "base_bdevs_list": [ 00:23:15.888 { 00:23:15.888 "name": null, 00:23:15.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.888 "is_configured": false, 00:23:15.888 "data_offset": 2048, 00:23:15.888 "data_size": 63488 00:23:15.888 }, 00:23:15.888 { 00:23:15.888 "name": "BaseBdev2", 00:23:15.888 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:15.888 "is_configured": true, 00:23:15.888 "data_offset": 2048, 00:23:15.888 "data_size": 63488 00:23:15.888 } 00:23:15.888 ] 00:23:15.888 }' 00:23:15.888 10:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.888 10:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:16.455 10:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:16.714 [2024-07-12 10:49:51.748495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:16.714 [2024-07-12 10:49:51.748655] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:16.714 [2024-07-12 10:49:51.748676] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
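Note: the remove/re-add cycle traced above is what produces the "Re-adding bdev spare to raid bdev raid_bdev1" path: the spare's superblock sequence number (4) is older than the array's (5), so the raid module re-admits it and rebuilds onto it. A hedged sketch of the same sequence via rpc.py (same socket and names as in this run; the wait loop is illustrative, not the test's helper):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Drop the spare; raid_bdev1 keeps running degraded (1 of 2 members).
    $rpc -s $sock bdev_raid_remove_base_bdev spare

    # Re-add it; the stale superblock sequence number triggers a rebuild onto it.
    $rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 spare

    # Wait until both base bdevs are reported as discovered again.
    until [ "$($rpc -s $sock bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered')" -eq 2 ]; do
        sleep 1
    done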
00:23:16.714 [2024-07-12 10:49:51.748706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:16.714 [2024-07-12 10:49:51.753565] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a74a0 00:23:16.714 [2024-07-12 10:49:51.756018] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:16.714 10:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:17.651 10:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:17.651 10:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.651 10:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:17.651 10:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:17.651 10:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.651 10:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.651 10:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.910 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.910 "name": "raid_bdev1", 00:23:17.910 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:17.910 "strip_size_kb": 0, 00:23:17.910 "state": "online", 00:23:17.910 "raid_level": "raid1", 00:23:17.910 "superblock": true, 00:23:17.910 "num_base_bdevs": 2, 00:23:17.910 "num_base_bdevs_discovered": 2, 00:23:17.910 "num_base_bdevs_operational": 2, 00:23:17.910 "process": { 00:23:17.910 "type": "rebuild", 00:23:17.910 "target": "spare", 00:23:17.910 "progress": { 00:23:17.910 "blocks": 24576, 00:23:17.910 "percent": 38 00:23:17.910 } 00:23:17.910 }, 00:23:17.910 "base_bdevs_list": [ 00:23:17.910 { 00:23:17.910 "name": "spare", 00:23:17.910 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:17.910 "is_configured": true, 00:23:17.910 "data_offset": 2048, 00:23:17.910 "data_size": 63488 00:23:17.910 }, 00:23:17.910 { 00:23:17.910 "name": "BaseBdev2", 00:23:17.910 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:17.910 "is_configured": true, 00:23:17.910 "data_offset": 2048, 00:23:17.910 "data_size": 63488 00:23:17.910 } 00:23:17.910 ] 00:23:17.910 }' 00:23:17.910 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.910 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.910 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:18.167 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:18.167 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:18.424 [2024-07-12 10:49:53.369961] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:18.424 [2024-07-12 10:49:53.469420] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:18.424 [2024-07-12 10:49:53.469470] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:23:18.424 [2024-07-12 10:49:53.469492] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:18.424 [2024-07-12 10:49:53.469501] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.424 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.682 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.682 "name": "raid_bdev1", 00:23:18.682 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:18.682 "strip_size_kb": 0, 00:23:18.682 "state": "online", 00:23:18.682 "raid_level": "raid1", 00:23:18.682 "superblock": true, 00:23:18.682 "num_base_bdevs": 2, 00:23:18.682 "num_base_bdevs_discovered": 1, 00:23:18.682 "num_base_bdevs_operational": 1, 00:23:18.682 "base_bdevs_list": [ 00:23:18.682 { 00:23:18.682 "name": null, 00:23:18.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.682 "is_configured": false, 00:23:18.682 "data_offset": 2048, 00:23:18.682 "data_size": 63488 00:23:18.682 }, 00:23:18.682 { 00:23:18.682 "name": "BaseBdev2", 00:23:18.682 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:18.682 "is_configured": true, 00:23:18.682 "data_offset": 2048, 00:23:18.682 "data_size": 63488 00:23:18.682 } 00:23:18.682 ] 00:23:18.682 }' 00:23:18.682 10:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.682 10:49:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:19.250 10:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:19.508 [2024-07-12 10:49:54.585298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:19.508 [2024-07-12 10:49:54.585353] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:19.508 [2024-07-12 10:49:54.585379] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17f3c80 00:23:19.508 [2024-07-12 10:49:54.585392] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:23:19.508 [2024-07-12 10:49:54.585779] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:19.508 [2024-07-12 10:49:54.585799] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:19.508 [2024-07-12 10:49:54.585880] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:19.508 [2024-07-12 10:49:54.585894] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:19.508 [2024-07-12 10:49:54.585905] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:19.508 [2024-07-12 10:49:54.585923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:19.508 [2024-07-12 10:49:54.590802] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f5ef0 00:23:19.508 spare 00:23:19.508 [2024-07-12 10:49:54.592290] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:19.508 10:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:20.441 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:20.441 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.441 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:20.441 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:20.441 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.441 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.441 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.698 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.698 "name": "raid_bdev1", 00:23:20.698 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:20.698 "strip_size_kb": 0, 00:23:20.698 "state": "online", 00:23:20.698 "raid_level": "raid1", 00:23:20.698 "superblock": true, 00:23:20.698 "num_base_bdevs": 2, 00:23:20.698 "num_base_bdevs_discovered": 2, 00:23:20.698 "num_base_bdevs_operational": 2, 00:23:20.698 "process": { 00:23:20.698 "type": "rebuild", 00:23:20.698 "target": "spare", 00:23:20.698 "progress": { 00:23:20.698 "blocks": 24576, 00:23:20.698 "percent": 38 00:23:20.698 } 00:23:20.698 }, 00:23:20.698 "base_bdevs_list": [ 00:23:20.698 { 00:23:20.698 "name": "spare", 00:23:20.698 "uuid": "abda3389-5df4-5b79-b0e2-0573e01490ad", 00:23:20.698 "is_configured": true, 00:23:20.698 "data_offset": 2048, 00:23:20.698 "data_size": 63488 00:23:20.698 }, 00:23:20.698 { 00:23:20.698 "name": "BaseBdev2", 00:23:20.698 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:20.698 "is_configured": true, 00:23:20.698 "data_offset": 2048, 00:23:20.698 "data_size": 63488 00:23:20.698 } 00:23:20.698 ] 00:23:20.698 }' 00:23:20.698 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.955 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:20.955 10:49:55 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.955 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:20.955 10:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:20.955 [2024-07-12 10:49:56.131067] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:21.212 [2024-07-12 10:49:56.204584] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:21.212 [2024-07-12 10:49:56.204638] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.212 [2024-07-12 10:49:56.204654] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:21.212 [2024-07-12 10:49:56.204662] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.212 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.777 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.777 "name": "raid_bdev1", 00:23:21.777 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:21.777 "strip_size_kb": 0, 00:23:21.777 "state": "online", 00:23:21.777 "raid_level": "raid1", 00:23:21.777 "superblock": true, 00:23:21.777 "num_base_bdevs": 2, 00:23:21.777 "num_base_bdevs_discovered": 1, 00:23:21.777 "num_base_bdevs_operational": 1, 00:23:21.777 "base_bdevs_list": [ 00:23:21.777 { 00:23:21.777 "name": null, 00:23:21.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.777 "is_configured": false, 00:23:21.777 "data_offset": 2048, 00:23:21.777 "data_size": 63488 00:23:21.777 }, 00:23:21.777 { 00:23:21.777 "name": "BaseBdev2", 00:23:21.777 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:21.777 "is_configured": true, 00:23:21.777 "data_offset": 2048, 00:23:21.777 "data_size": 63488 00:23:21.777 } 00:23:21.777 ] 00:23:21.777 }' 00:23:21.777 10:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.777 10:49:56 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:23:22.342 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:22.342 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:22.342 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:22.342 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:22.342 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:22.342 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.342 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.600 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:22.600 "name": "raid_bdev1", 00:23:22.600 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:22.600 "strip_size_kb": 0, 00:23:22.600 "state": "online", 00:23:22.600 "raid_level": "raid1", 00:23:22.600 "superblock": true, 00:23:22.600 "num_base_bdevs": 2, 00:23:22.600 "num_base_bdevs_discovered": 1, 00:23:22.600 "num_base_bdevs_operational": 1, 00:23:22.600 "base_bdevs_list": [ 00:23:22.600 { 00:23:22.600 "name": null, 00:23:22.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.600 "is_configured": false, 00:23:22.600 "data_offset": 2048, 00:23:22.600 "data_size": 63488 00:23:22.600 }, 00:23:22.600 { 00:23:22.600 "name": "BaseBdev2", 00:23:22.600 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:22.600 "is_configured": true, 00:23:22.600 "data_offset": 2048, 00:23:22.600 "data_size": 63488 00:23:22.600 } 00:23:22.600 ] 00:23:22.600 }' 00:23:22.600 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:22.600 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:22.600 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:22.600 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:22.600 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:22.857 10:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:23.115 [2024-07-12 10:49:58.138879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:23.115 [2024-07-12 10:49:58.138929] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.115 [2024-07-12 10:49:58.138957] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19ad570 00:23:23.115 [2024-07-12 10:49:58.138970] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.115 [2024-07-12 10:49:58.139325] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.115 [2024-07-12 10:49:58.139345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:23.115 [2024-07-12 10:49:58.139414] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:23.115 [2024-07-12 10:49:58.139427] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:23.115 [2024-07-12 10:49:58.139437] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:23.115 BaseBdev1 00:23:23.115 10:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.046 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.303 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.303 "name": "raid_bdev1", 00:23:24.303 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:24.303 "strip_size_kb": 0, 00:23:24.303 "state": "online", 00:23:24.303 "raid_level": "raid1", 00:23:24.303 "superblock": true, 00:23:24.303 "num_base_bdevs": 2, 00:23:24.303 "num_base_bdevs_discovered": 1, 00:23:24.303 "num_base_bdevs_operational": 1, 00:23:24.303 "base_bdevs_list": [ 00:23:24.303 { 00:23:24.303 "name": null, 00:23:24.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:24.303 "is_configured": false, 00:23:24.303 "data_offset": 2048, 00:23:24.303 "data_size": 63488 00:23:24.303 }, 00:23:24.303 { 00:23:24.303 "name": "BaseBdev2", 00:23:24.303 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:24.303 "is_configured": true, 00:23:24.303 "data_offset": 2048, 00:23:24.303 "data_size": 63488 00:23:24.303 } 00:23:24.303 ] 00:23:24.303 }' 00:23:24.303 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.303 10:49:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:24.866 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:24.866 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:24.866 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:24.866 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:23:24.866 10:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:24.866 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.866 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.122 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:25.122 "name": "raid_bdev1", 00:23:25.122 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:25.122 "strip_size_kb": 0, 00:23:25.122 "state": "online", 00:23:25.122 "raid_level": "raid1", 00:23:25.122 "superblock": true, 00:23:25.122 "num_base_bdevs": 2, 00:23:25.122 "num_base_bdevs_discovered": 1, 00:23:25.122 "num_base_bdevs_operational": 1, 00:23:25.122 "base_bdevs_list": [ 00:23:25.122 { 00:23:25.122 "name": null, 00:23:25.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.122 "is_configured": false, 00:23:25.122 "data_offset": 2048, 00:23:25.122 "data_size": 63488 00:23:25.123 }, 00:23:25.123 { 00:23:25.123 "name": "BaseBdev2", 00:23:25.123 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:25.123 "is_configured": true, 00:23:25.123 "data_offset": 2048, 00:23:25.123 "data_size": 63488 00:23:25.123 } 00:23:25.123 ] 00:23:25.123 }' 00:23:25.123 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:25.123 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:25.123 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:25.379 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:25.379 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:25.379 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:25.379 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:25.379 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:25.379 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:25.379 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:25.380 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:25.380 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:25.380 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:25.380 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:25.380 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:25.380 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:25.945 [2024-07-12 10:50:00.846080] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:25.945 [2024-07-12 10:50:00.846217] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:25.945 [2024-07-12 10:50:00.846234] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:25.945 request: 00:23:25.945 { 00:23:25.945 "base_bdev": "BaseBdev1", 00:23:25.945 "raid_bdev": "raid_bdev1", 00:23:25.945 "method": "bdev_raid_add_base_bdev", 00:23:25.945 "req_id": 1 00:23:25.945 } 00:23:25.945 Got JSON-RPC error response 00:23:25.945 response: 00:23:25.945 { 00:23:25.945 "code": -22, 00:23:25.945 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:25.945 } 00:23:25.945 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:25.945 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:25.945 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:25.945 10:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:25.945 10:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.878 10:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.137 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.137 "name": "raid_bdev1", 00:23:27.137 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:27.137 "strip_size_kb": 0, 00:23:27.137 "state": "online", 00:23:27.137 "raid_level": "raid1", 00:23:27.137 "superblock": true, 00:23:27.137 "num_base_bdevs": 2, 00:23:27.137 "num_base_bdevs_discovered": 1, 00:23:27.137 "num_base_bdevs_operational": 1, 00:23:27.137 
"base_bdevs_list": [ 00:23:27.137 { 00:23:27.137 "name": null, 00:23:27.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.137 "is_configured": false, 00:23:27.137 "data_offset": 2048, 00:23:27.137 "data_size": 63488 00:23:27.137 }, 00:23:27.137 { 00:23:27.137 "name": "BaseBdev2", 00:23:27.137 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:27.137 "is_configured": true, 00:23:27.137 "data_offset": 2048, 00:23:27.137 "data_size": 63488 00:23:27.137 } 00:23:27.137 ] 00:23:27.137 }' 00:23:27.137 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.137 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:27.703 "name": "raid_bdev1", 00:23:27.703 "uuid": "5eb9de84-dbcd-4e44-bf8f-e30e8908c6ba", 00:23:27.703 "strip_size_kb": 0, 00:23:27.703 "state": "online", 00:23:27.703 "raid_level": "raid1", 00:23:27.703 "superblock": true, 00:23:27.703 "num_base_bdevs": 2, 00:23:27.703 "num_base_bdevs_discovered": 1, 00:23:27.703 "num_base_bdevs_operational": 1, 00:23:27.703 "base_bdevs_list": [ 00:23:27.703 { 00:23:27.703 "name": null, 00:23:27.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.703 "is_configured": false, 00:23:27.703 "data_offset": 2048, 00:23:27.703 "data_size": 63488 00:23:27.703 }, 00:23:27.703 { 00:23:27.703 "name": "BaseBdev2", 00:23:27.703 "uuid": "fc82cfe7-aac6-5246-9467-b61ae82ba19d", 00:23:27.703 "is_configured": true, 00:23:27.703 "data_offset": 2048, 00:23:27.703 "data_size": 63488 00:23:27.703 } 00:23:27.703 ] 00:23:27.703 }' 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:27.703 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2124448 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2124448 ']' 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2124448 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:27.961 10:50:02 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2124448 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2124448' 00:23:27.961 killing process with pid 2124448 00:23:27.961 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2124448 00:23:27.961 Received shutdown signal, test time was about 60.000000 seconds 00:23:27.961 00:23:27.961 Latency(us) 00:23:27.961 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:27.961 =================================================================================================================== 00:23:27.961 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:27.961 [2024-07-12 10:50:02.973845] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:27.961 [2024-07-12 10:50:02.973944] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:27.961 [2024-07-12 10:50:02.973990] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to fr 10:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2124448 00:23:27.961 ee all in destruct 00:23:27.961 [2024-07-12 10:50:02.974011] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19ad2f0 name raid_bdev1, state offline 00:23:27.961 [2024-07-12 10:50:03.005331] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:23:28.218 00:23:28.218 real 0m35.390s 00:23:28.218 user 0m51.952s 00:23:28.218 sys 0m6.288s 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:28.218 ************************************ 00:23:28.218 END TEST raid_rebuild_test_sb 00:23:28.218 ************************************ 00:23:28.218 10:50:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:28.218 10:50:03 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:23:28.218 10:50:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:28.218 10:50:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:28.218 10:50:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:28.218 ************************************ 00:23:28.218 START TEST raid_rebuild_test_io 00:23:28.218 ************************************ 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 
00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2129473 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2129473 /var/tmp/spdk-raid.sock 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2129473 ']' 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:28.218 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:28.218 10:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:28.475 [2024-07-12 10:50:03.418911] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:23:28.475 [2024-07-12 10:50:03.419047] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2129473 ] 00:23:28.475 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:28.475 Zero copy mechanism will not be used. 00:23:28.475 [2024-07-12 10:50:03.613918] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:28.732 [2024-07-12 10:50:03.711066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:28.732 [2024-07-12 10:50:03.772079] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:28.732 [2024-07-12 10:50:03.772125] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:29.297 10:50:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:29.297 10:50:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:23:29.297 10:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:29.297 10:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:29.554 BaseBdev1_malloc 00:23:29.554 10:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:29.812 [2024-07-12 10:50:04.780560] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:29.812 [2024-07-12 10:50:04.780606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.812 [2024-07-12 10:50:04.780632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa4d40 00:23:29.812 [2024-07-12 10:50:04.780645] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.812 [2024-07-12 10:50:04.782408] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.812 [2024-07-12 10:50:04.782438] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:29.812 BaseBdev1 00:23:29.812 10:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:29.812 10:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:30.069 BaseBdev2_malloc 00:23:30.069 10:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:30.327 [2024-07-12 10:50:05.276045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:30.327 [2024-07-12 10:50:05.276092] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.327 [2024-07-12 10:50:05.276117] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa5860 00:23:30.327 [2024-07-12 10:50:05.276129] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.327 [2024-07-12 10:50:05.277699] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.327 [2024-07-12 10:50:05.277742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:30.327 BaseBdev2 00:23:30.327 10:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:30.327 spare_malloc 00:23:30.584 10:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:30.584 spare_delay 00:23:30.584 10:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:30.842 [2024-07-12 10:50:05.998501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:30.842 [2024-07-12 10:50:05.998546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.842 [2024-07-12 10:50:05.998565] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1153ec0 00:23:30.842 [2024-07-12 10:50:05.998579] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.842 [2024-07-12 10:50:06.000137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.842 [2024-07-12 10:50:06.000168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:30.842 spare 00:23:30.842 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:31.100 [2024-07-12 10:50:06.243158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:31.100 [2024-07-12 10:50:06.244524] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:31.100 [2024-07-12 10:50:06.244601] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1155070 00:23:31.100 [2024-07-12 10:50:06.244612] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:31.100 [2024-07-12 10:50:06.244826] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x114e490 00:23:31.100 [2024-07-12 10:50:06.244967] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1155070 00:23:31.100 [2024-07-12 10:50:06.244977] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1155070 00:23:31.100 [2024-07-12 10:50:06.245093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:31.100 10:50:06 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.100 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.358 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.358 "name": "raid_bdev1", 00:23:31.358 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:31.358 "strip_size_kb": 0, 00:23:31.358 "state": "online", 00:23:31.358 "raid_level": "raid1", 00:23:31.358 "superblock": false, 00:23:31.358 "num_base_bdevs": 2, 00:23:31.358 "num_base_bdevs_discovered": 2, 00:23:31.358 "num_base_bdevs_operational": 2, 00:23:31.358 "base_bdevs_list": [ 00:23:31.358 { 00:23:31.358 "name": "BaseBdev1", 00:23:31.358 "uuid": "186aca26-b5c9-5369-996d-514ab7b24a36", 00:23:31.358 "is_configured": true, 00:23:31.358 "data_offset": 0, 00:23:31.358 "data_size": 65536 00:23:31.358 }, 00:23:31.358 { 00:23:31.358 "name": "BaseBdev2", 00:23:31.358 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:31.358 "is_configured": true, 00:23:31.358 "data_offset": 0, 00:23:31.358 "data_size": 65536 00:23:31.358 } 00:23:31.358 ] 00:23:31.358 }' 00:23:31.358 10:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.358 10:50:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:31.923 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:31.923 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:32.180 [2024-07-12 10:50:07.322252] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:32.180 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:32.180 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:32.180 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.438 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:32.438 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:32.438 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:32.438 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:32.696 [2024-07-12 10:50:07.809386] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x114fbd0 00:23:32.696 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:32.696 Zero copy mechanism will not be used. 00:23:32.696 Running I/O for 60 seconds... 00:23:32.696 [2024-07-12 10:50:07.822356] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:32.696 [2024-07-12 10:50:07.830474] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x114fbd0 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.696 10:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.955 10:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.955 "name": "raid_bdev1", 00:23:32.955 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:32.955 "strip_size_kb": 0, 00:23:32.955 "state": "online", 00:23:32.955 "raid_level": "raid1", 00:23:32.955 "superblock": false, 00:23:32.955 "num_base_bdevs": 2, 00:23:32.955 "num_base_bdevs_discovered": 1, 00:23:32.955 "num_base_bdevs_operational": 1, 00:23:32.955 "base_bdevs_list": [ 00:23:32.955 { 00:23:32.955 "name": null, 00:23:32.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.955 "is_configured": false, 00:23:32.955 "data_offset": 0, 00:23:32.955 "data_size": 65536 00:23:32.955 }, 00:23:32.955 { 00:23:32.955 "name": "BaseBdev2", 00:23:32.955 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:32.955 "is_configured": true, 00:23:32.955 "data_offset": 0, 00:23:32.955 "data_size": 65536 00:23:32.955 } 00:23:32.955 ] 00:23:32.955 }' 00:23:32.955 10:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.955 10:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:33.927 10:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:33.927 [2024-07-12 10:50:08.965266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:33.927 10:50:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:33.927 [2024-07-12 10:50:09.048726] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x10d78b0 00:23:33.927 [2024-07-12 10:50:09.051099] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:34.185 [2024-07-12 10:50:09.177250] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:34.185 [2024-07-12 10:50:09.177750] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:34.443 [2024-07-12 10:50:09.406404] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:34.443 [2024-07-12 10:50:09.406661] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:35.008 [2024-07-12 10:50:09.901108] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:35.008 [2024-07-12 10:50:09.901292] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:35.008 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:35.008 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.008 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:35.008 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:35.008 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.008 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.008 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.265 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.265 "name": "raid_bdev1", 00:23:35.265 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:35.265 "strip_size_kb": 0, 00:23:35.265 "state": "online", 00:23:35.265 "raid_level": "raid1", 00:23:35.265 "superblock": false, 00:23:35.265 "num_base_bdevs": 2, 00:23:35.265 "num_base_bdevs_discovered": 2, 00:23:35.265 "num_base_bdevs_operational": 2, 00:23:35.265 "process": { 00:23:35.265 "type": "rebuild", 00:23:35.265 "target": "spare", 00:23:35.265 "progress": { 00:23:35.265 "blocks": 12288, 00:23:35.265 "percent": 18 00:23:35.265 } 00:23:35.265 }, 00:23:35.265 "base_bdevs_list": [ 00:23:35.265 { 00:23:35.265 "name": "spare", 00:23:35.265 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:35.265 "is_configured": true, 00:23:35.265 "data_offset": 0, 00:23:35.265 "data_size": 65536 00:23:35.265 }, 00:23:35.265 { 00:23:35.265 "name": "BaseBdev2", 00:23:35.265 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:35.265 "is_configured": true, 00:23:35.265 "data_offset": 0, 00:23:35.265 "data_size": 65536 00:23:35.265 } 00:23:35.265 ] 00:23:35.265 }' 00:23:35.265 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.265 [2024-07-12 10:50:10.267512] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:35.265 [2024-07-12 10:50:10.267916] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:35.265 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:35.266 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.266 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:35.266 10:50:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:35.524 [2024-07-12 10:50:10.487395] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:35.782 [2024-07-12 10:50:10.803012] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:35.782 [2024-07-12 10:50:10.851517] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:35.782 [2024-07-12 10:50:10.851690] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:35.782 [2024-07-12 10:50:10.961102] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:35.782 [2024-07-12 10:50:10.962855] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:35.782 [2024-07-12 10:50:10.962881] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:35.782 [2024-07-12 10:50:10.962891] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:35.782 [2024-07-12 10:50:10.976628] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x114fbd0 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.040 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.305 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.305 "name": "raid_bdev1", 00:23:36.305 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:36.305 
"strip_size_kb": 0, 00:23:36.305 "state": "online", 00:23:36.305 "raid_level": "raid1", 00:23:36.305 "superblock": false, 00:23:36.305 "num_base_bdevs": 2, 00:23:36.305 "num_base_bdevs_discovered": 1, 00:23:36.305 "num_base_bdevs_operational": 1, 00:23:36.305 "base_bdevs_list": [ 00:23:36.305 { 00:23:36.305 "name": null, 00:23:36.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.305 "is_configured": false, 00:23:36.305 "data_offset": 0, 00:23:36.305 "data_size": 65536 00:23:36.305 }, 00:23:36.305 { 00:23:36.305 "name": "BaseBdev2", 00:23:36.305 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:36.305 "is_configured": true, 00:23:36.305 "data_offset": 0, 00:23:36.305 "data_size": 65536 00:23:36.305 } 00:23:36.305 ] 00:23:36.305 }' 00:23:36.305 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.305 10:50:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:36.871 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:36.871 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.871 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:36.871 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:36.871 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.871 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.871 10:50:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.129 10:50:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.129 "name": "raid_bdev1", 00:23:37.129 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:37.129 "strip_size_kb": 0, 00:23:37.129 "state": "online", 00:23:37.129 "raid_level": "raid1", 00:23:37.129 "superblock": false, 00:23:37.129 "num_base_bdevs": 2, 00:23:37.129 "num_base_bdevs_discovered": 1, 00:23:37.129 "num_base_bdevs_operational": 1, 00:23:37.129 "base_bdevs_list": [ 00:23:37.129 { 00:23:37.129 "name": null, 00:23:37.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.129 "is_configured": false, 00:23:37.129 "data_offset": 0, 00:23:37.129 "data_size": 65536 00:23:37.129 }, 00:23:37.129 { 00:23:37.129 "name": "BaseBdev2", 00:23:37.129 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:37.129 "is_configured": true, 00:23:37.129 "data_offset": 0, 00:23:37.129 "data_size": 65536 00:23:37.129 } 00:23:37.129 ] 00:23:37.129 }' 00:23:37.129 10:50:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.129 10:50:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:37.129 10:50:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.387 10:50:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:37.387 10:50:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:37.387 [2024-07-12 10:50:12.554682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev spare is claimed 00:23:37.644 [2024-07-12 10:50:12.598901] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1155450 00:23:37.644 [2024-07-12 10:50:12.600424] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:37.644 10:50:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:37.644 [2024-07-12 10:50:12.727584] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:37.644 [2024-07-12 10:50:12.727904] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:37.901 [2024-07-12 10:50:12.931153] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:37.901 [2024-07-12 10:50:12.931370] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:38.158 [2024-07-12 10:50:13.271509] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:38.158 [2024-07-12 10:50:13.271833] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:38.415 [2024-07-12 10:50:13.473598] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:38.415 [2024-07-12 10:50:13.473762] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:38.672 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:38.672 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.672 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:38.672 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:38.672 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.672 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.672 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.672 [2024-07-12 10:50:13.829775] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:38.929 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.929 "name": "raid_bdev1", 00:23:38.929 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:38.930 "strip_size_kb": 0, 00:23:38.930 "state": "online", 00:23:38.930 "raid_level": "raid1", 00:23:38.930 "superblock": false, 00:23:38.930 "num_base_bdevs": 2, 00:23:38.930 "num_base_bdevs_discovered": 2, 00:23:38.930 "num_base_bdevs_operational": 2, 00:23:38.930 "process": { 00:23:38.930 "type": "rebuild", 00:23:38.930 "target": "spare", 00:23:38.930 "progress": { 00:23:38.930 "blocks": 14336, 00:23:38.930 "percent": 21 00:23:38.930 } 00:23:38.930 }, 00:23:38.930 "base_bdevs_list": [ 00:23:38.930 { 00:23:38.930 "name": "spare", 00:23:38.930 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:38.930 "is_configured": true, 
00:23:38.930 "data_offset": 0, 00:23:38.930 "data_size": 65536 00:23:38.930 }, 00:23:38.930 { 00:23:38.930 "name": "BaseBdev2", 00:23:38.930 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:38.930 "is_configured": true, 00:23:38.930 "data_offset": 0, 00:23:38.930 "data_size": 65536 00:23:38.930 } 00:23:38.930 ] 00:23:38.930 }' 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:38.930 [2024-07-12 10:50:13.951198] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=809 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.930 10:50:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.187 10:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.187 "name": "raid_bdev1", 00:23:39.187 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:39.187 "strip_size_kb": 0, 00:23:39.187 "state": "online", 00:23:39.187 "raid_level": "raid1", 00:23:39.187 "superblock": false, 00:23:39.187 "num_base_bdevs": 2, 00:23:39.187 "num_base_bdevs_discovered": 2, 00:23:39.187 "num_base_bdevs_operational": 2, 00:23:39.187 "process": { 00:23:39.187 "type": "rebuild", 00:23:39.187 "target": "spare", 00:23:39.187 "progress": { 00:23:39.187 "blocks": 18432, 00:23:39.187 "percent": 28 00:23:39.187 } 00:23:39.187 }, 00:23:39.187 "base_bdevs_list": [ 00:23:39.187 { 00:23:39.187 "name": "spare", 00:23:39.187 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:39.187 "is_configured": true, 00:23:39.187 "data_offset": 0, 00:23:39.187 "data_size": 65536 00:23:39.187 }, 00:23:39.187 { 00:23:39.187 "name": "BaseBdev2", 00:23:39.187 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:39.187 "is_configured": true, 00:23:39.187 
"data_offset": 0, 00:23:39.187 "data_size": 65536 00:23:39.187 } 00:23:39.187 ] 00:23:39.187 }' 00:23:39.187 10:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.187 10:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:39.187 10:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.187 10:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:39.187 10:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:39.445 [2024-07-12 10:50:14.383609] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:39.445 [2024-07-12 10:50:14.630955] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:39.703 [2024-07-12 10:50:14.842141] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:40.268 [2024-07-12 10:50:15.171495] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.268 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.526 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:40.526 "name": "raid_bdev1", 00:23:40.526 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:40.526 "strip_size_kb": 0, 00:23:40.526 "state": "online", 00:23:40.526 "raid_level": "raid1", 00:23:40.526 "superblock": false, 00:23:40.526 "num_base_bdevs": 2, 00:23:40.526 "num_base_bdevs_discovered": 2, 00:23:40.526 "num_base_bdevs_operational": 2, 00:23:40.526 "process": { 00:23:40.526 "type": "rebuild", 00:23:40.526 "target": "spare", 00:23:40.526 "progress": { 00:23:40.526 "blocks": 36864, 00:23:40.526 "percent": 56 00:23:40.526 } 00:23:40.526 }, 00:23:40.526 "base_bdevs_list": [ 00:23:40.526 { 00:23:40.526 "name": "spare", 00:23:40.526 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:40.526 "is_configured": true, 00:23:40.526 "data_offset": 0, 00:23:40.526 "data_size": 65536 00:23:40.526 }, 00:23:40.526 { 00:23:40.526 "name": "BaseBdev2", 00:23:40.526 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:40.526 "is_configured": true, 00:23:40.526 "data_offset": 0, 00:23:40.526 "data_size": 65536 00:23:40.526 } 00:23:40.526 ] 00:23:40.526 }' 00:23:40.526 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:23:40.526 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:40.526 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:40.526 [2024-07-12 10:50:15.610281] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:23:40.526 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:40.526 10:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:40.783 [2024-07-12 10:50:15.728723] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:41.717 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:41.717 "name": "raid_bdev1", 00:23:41.717 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:41.717 "strip_size_kb": 0, 00:23:41.717 "state": "online", 00:23:41.717 "raid_level": "raid1", 00:23:41.717 "superblock": false, 00:23:41.717 "num_base_bdevs": 2, 00:23:41.717 "num_base_bdevs_discovered": 2, 00:23:41.717 "num_base_bdevs_operational": 2, 00:23:41.717 "process": { 00:23:41.717 "type": "rebuild", 00:23:41.717 "target": "spare", 00:23:41.717 "progress": { 00:23:41.717 "blocks": 59392, 00:23:41.717 "percent": 90 00:23:41.717 } 00:23:41.717 }, 00:23:41.717 "base_bdevs_list": [ 00:23:41.717 { 00:23:41.717 "name": "spare", 00:23:41.717 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:41.718 "is_configured": true, 00:23:41.718 "data_offset": 0, 00:23:41.718 "data_size": 65536 00:23:41.718 }, 00:23:41.718 { 00:23:41.718 "name": "BaseBdev2", 00:23:41.718 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:41.718 "is_configured": true, 00:23:41.718 "data_offset": 0, 00:23:41.718 "data_size": 65536 00:23:41.718 } 00:23:41.718 ] 00:23:41.718 }' 00:23:41.718 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:41.718 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:41.718 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:41.975 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:41.975 10:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:41.975 [2024-07-12 10:50:17.166147] 
bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:42.233 [2024-07-12 10:50:17.274386] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:42.233 [2024-07-12 10:50:17.275958] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.799 10:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.056 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:43.056 "name": "raid_bdev1", 00:23:43.056 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:43.056 "strip_size_kb": 0, 00:23:43.056 "state": "online", 00:23:43.056 "raid_level": "raid1", 00:23:43.056 "superblock": false, 00:23:43.056 "num_base_bdevs": 2, 00:23:43.056 "num_base_bdevs_discovered": 2, 00:23:43.056 "num_base_bdevs_operational": 2, 00:23:43.056 "base_bdevs_list": [ 00:23:43.056 { 00:23:43.056 "name": "spare", 00:23:43.056 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:43.056 "is_configured": true, 00:23:43.056 "data_offset": 0, 00:23:43.056 "data_size": 65536 00:23:43.056 }, 00:23:43.056 { 00:23:43.056 "name": "BaseBdev2", 00:23:43.056 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:43.056 "is_configured": true, 00:23:43.056 "data_offset": 0, 00:23:43.056 "data_size": 65536 00:23:43.056 } 00:23:43.056 ] 00:23:43.056 }' 00:23:43.056 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:43.056 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:43.056 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:43.314 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.314 10:50:18 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:43.572 "name": "raid_bdev1", 00:23:43.572 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:43.572 "strip_size_kb": 0, 00:23:43.572 "state": "online", 00:23:43.572 "raid_level": "raid1", 00:23:43.572 "superblock": false, 00:23:43.572 "num_base_bdevs": 2, 00:23:43.572 "num_base_bdevs_discovered": 2, 00:23:43.572 "num_base_bdevs_operational": 2, 00:23:43.572 "base_bdevs_list": [ 00:23:43.572 { 00:23:43.572 "name": "spare", 00:23:43.572 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:43.572 "is_configured": true, 00:23:43.572 "data_offset": 0, 00:23:43.572 "data_size": 65536 00:23:43.572 }, 00:23:43.572 { 00:23:43.572 "name": "BaseBdev2", 00:23:43.572 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:43.572 "is_configured": true, 00:23:43.572 "data_offset": 0, 00:23:43.572 "data_size": 65536 00:23:43.572 } 00:23:43.572 ] 00:23:43.572 }' 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.572 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.830 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.830 "name": "raid_bdev1", 00:23:43.830 "uuid": "5859333e-f984-4e3b-b9bb-017437498c44", 00:23:43.830 "strip_size_kb": 0, 00:23:43.830 "state": "online", 00:23:43.830 "raid_level": "raid1", 00:23:43.830 "superblock": false, 00:23:43.830 "num_base_bdevs": 2, 00:23:43.830 "num_base_bdevs_discovered": 2, 00:23:43.830 "num_base_bdevs_operational": 2, 00:23:43.830 "base_bdevs_list": [ 00:23:43.830 { 00:23:43.830 "name": "spare", 
00:23:43.830 "uuid": "aeee2167-dbc2-5f6a-bb3a-c50926f02ada", 00:23:43.830 "is_configured": true, 00:23:43.830 "data_offset": 0, 00:23:43.830 "data_size": 65536 00:23:43.830 }, 00:23:43.830 { 00:23:43.830 "name": "BaseBdev2", 00:23:43.830 "uuid": "a5dce0c5-834e-5d0f-bb05-af824451c00d", 00:23:43.830 "is_configured": true, 00:23:43.830 "data_offset": 0, 00:23:43.830 "data_size": 65536 00:23:43.830 } 00:23:43.830 ] 00:23:43.830 }' 00:23:43.830 10:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.830 10:50:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:44.766 10:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:44.766 [2024-07-12 10:50:19.932182] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:44.766 [2024-07-12 10:50:19.932216] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:45.022 00:23:45.022 Latency(us) 00:23:45.022 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:45.022 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:45.022 raid_bdev1 : 12.19 91.05 273.15 0.00 0.00 14770.62 290.28 119446.48 00:23:45.022 =================================================================================================================== 00:23:45.022 Total : 91.05 273.15 0.00 0.00 14770.62 290.28 119446.48 00:23:45.022 [2024-07-12 10:50:20.036546] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:45.022 [2024-07-12 10:50:20.036573] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:45.022 [2024-07-12 10:50:20.036646] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:45.022 [2024-07-12 10:50:20.036658] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1155070 name raid_bdev1, state offline 00:23:45.022 0 00:23:45.022 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.023 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:45.279 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:45.536 /dev/nbd0 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:45.536 1+0 records in 00:23:45.536 1+0 records out 00:23:45.536 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279373 s, 14.7 MB/s 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- 
# local i 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:45.536 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:45.793 /dev/nbd1 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:45.793 1+0 records in 00:23:45.793 1+0 records out 00:23:45.793 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289888 s, 14.1 MB/s 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:45.793 10:50:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:46.050 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2129473 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2129473 ']' 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2129473 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2129473 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
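The nbd steps traced above are the data-integrity check for the finished rebuild: the rebuilt target ("spare") and the surviving member ("BaseBdev2") are both exported as kernel NBD block devices, byte-compared, and unmapped again. A minimal sketch of that flow, condensed from the rpc.py and cmp invocations visible in the trace (socket path and bdev names are the ones this run uses; the harness's waitfornbd/waitfornbd_exit polling and error handling are omitted):

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
  sock="/var/tmp/spdk-raid.sock"

  # export both raid1 members as NBD devices
  "$rpc_py" -s "$sock" nbd_start_disk spare /dev/nbd0        # rebuilt target
  "$rpc_py" -s "$sock" nbd_start_disk BaseBdev2 /dev/nbd1    # surviving base bdev

  # after a completed raid1 rebuild the two copies must be byte-identical
  cmp -i 0 /dev/nbd0 /dev/nbd1 || echo "rebuild produced a mismatched mirror" >&2

  # tear the NBD mappings down again
  "$rpc_py" -s "$sock" nbd_stop_disk /dev/nbd1
  "$rpc_py" -s "$sock" nbd_stop_disk /dev/nbd0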
00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2129473' 00:23:46.326 killing process with pid 2129473 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2129473 00:23:46.326 Received shutdown signal, test time was about 13.627739 seconds 00:23:46.326 00:23:46.326 Latency(us) 00:23:46.326 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:46.326 =================================================================================================================== 00:23:46.326 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:46.326 [2024-07-12 10:50:21.472624] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:46.326 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2129473 00:23:46.326 [2024-07-12 10:50:21.493324] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:46.584 10:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:46.584 00:23:46.584 real 0m18.397s 00:23:46.584 user 0m28.112s 00:23:46.584 sys 0m2.982s 00:23:46.584 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:46.584 10:50:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:46.584 ************************************ 00:23:46.584 END TEST raid_rebuild_test_io 00:23:46.584 ************************************ 00:23:46.584 10:50:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:46.584 10:50:21 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:23:46.584 10:50:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:46.584 10:50:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:46.584 10:50:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:46.842 ************************************ 00:23:46.842 START TEST raid_rebuild_test_sb_io 00:23:46.842 ************************************ 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:46.842 10:50:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2131996 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2131996 /var/tmp/spdk-raid.sock 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2131996 ']' 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:46.842 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:46.842 10:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:46.842 [2024-07-12 10:50:21.848877] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:23:46.842 [2024-07-12 10:50:21.848939] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2131996 ] 00:23:46.842 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:46.842 Zero copy mechanism will not be used. 
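The sb_io variant exercises the same rebuild path while bdevperf drives background I/O against raid_bdev1. The launch captured above reduces to the following sketch, with the bdevperf flags copied verbatim from the trace (the real harness tracks the PID through its waitforlisten/killprocess helpers, which are elided here):

  bdevperf="/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf"
  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
  sock="/var/tmp/spdk-raid.sock"

  # 60 s of 50/50 random read/write, 3 MiB I/Os, queue depth 2; -z holds the workload
  # until a perform_tests RPC arrives, -L bdev_raid enables the *DEBUG* raid traces seen in this log
  "$bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!

  # block until the app answers on its RPC socket before creating any bdevs
  until "$rpc_py" -s "$sock" rpc_get_methods >/dev/null 2>&1; do sleep 0.5; done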
00:23:46.842 [2024-07-12 10:50:21.978021] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:47.100 [2024-07-12 10:50:22.086953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:47.100 [2024-07-12 10:50:22.153607] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:47.100 [2024-07-12 10:50:22.153636] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:47.357 10:50:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:47.357 10:50:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:23:47.357 10:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:47.357 10:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:47.357 BaseBdev1_malloc 00:23:47.614 10:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:47.908 [2024-07-12 10:50:23.043214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:47.908 [2024-07-12 10:50:23.043265] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:47.908 [2024-07-12 10:50:23.043289] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb46d40 00:23:47.908 [2024-07-12 10:50:23.043302] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:47.908 [2024-07-12 10:50:23.045035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:47.908 [2024-07-12 10:50:23.045065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:47.908 BaseBdev1 00:23:48.179 10:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:48.179 10:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:48.179 BaseBdev2_malloc 00:23:48.179 10:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:48.437 [2024-07-12 10:50:23.481141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:48.437 [2024-07-12 10:50:23.481187] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:48.437 [2024-07-12 10:50:23.481209] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb47860 00:23:48.437 [2024-07-12 10:50:23.481224] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:48.437 [2024-07-12 10:50:23.482763] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:48.437 [2024-07-12 10:50:23.482792] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:48.437 BaseBdev2 00:23:48.437 10:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 
32 512 -b spare_malloc 00:23:48.694 spare_malloc 00:23:48.694 10:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:48.952 spare_delay 00:23:48.952 10:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:49.209 [2024-07-12 10:50:24.195572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:49.209 [2024-07-12 10:50:24.195620] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.209 [2024-07-12 10:50:24.195642] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcf5ec0 00:23:49.209 [2024-07-12 10:50:24.195654] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.209 [2024-07-12 10:50:24.197262] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.209 [2024-07-12 10:50:24.197295] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:49.209 spare 00:23:49.209 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:49.467 [2024-07-12 10:50:24.440242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:49.467 [2024-07-12 10:50:24.441599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:49.467 [2024-07-12 10:50:24.441767] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcf7070 00:23:49.467 [2024-07-12 10:50:24.441780] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:49.467 [2024-07-12 10:50:24.441981] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcf0490 00:23:49.467 [2024-07-12 10:50:24.442123] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcf7070 00:23:49.467 [2024-07-12 10:50:24.442133] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcf7070 00:23:49.467 [2024-07-12 10:50:24.442236] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.467 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.725 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.725 "name": "raid_bdev1", 00:23:49.725 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:49.725 "strip_size_kb": 0, 00:23:49.725 "state": "online", 00:23:49.725 "raid_level": "raid1", 00:23:49.725 "superblock": true, 00:23:49.725 "num_base_bdevs": 2, 00:23:49.725 "num_base_bdevs_discovered": 2, 00:23:49.725 "num_base_bdevs_operational": 2, 00:23:49.725 "base_bdevs_list": [ 00:23:49.725 { 00:23:49.725 "name": "BaseBdev1", 00:23:49.725 "uuid": "29d850fb-eaed-5a23-8318-c898a8dd4b7b", 00:23:49.725 "is_configured": true, 00:23:49.725 "data_offset": 2048, 00:23:49.725 "data_size": 63488 00:23:49.725 }, 00:23:49.725 { 00:23:49.725 "name": "BaseBdev2", 00:23:49.725 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:49.725 "is_configured": true, 00:23:49.725 "data_offset": 2048, 00:23:49.725 "data_size": 63488 00:23:49.725 } 00:23:49.725 ] 00:23:49.725 }' 00:23:49.725 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.725 10:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:50.289 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:50.289 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:50.546 [2024-07-12 10:50:25.507283] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:50.546 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:50.546 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.547 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:50.804 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:50.804 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:50.804 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:50.804 10:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:50.804 [2024-07-12 10:50:25.890138] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcf7c50 00:23:50.804 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:50.804 Zero copy mechanism will not be used. 00:23:50.804 Running I/O for 60 seconds... 
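At this point in the trace the superblock raid1 volume is online, the 60 second randrw workload has just been started through bdevperf's perform_tests RPC, and BaseBdev1 is being hot-removed so the following output can confirm that raid_bdev1 stays online in degraded, single-member mode before the spare is added back. Condensed into a sketch (RPC names, bdev names and jq filters are the ones appearing in the trace; the backgrounding and the expected value are shown for illustration only):

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
  sock="/var/tmp/spdk-raid.sock"

  # raid1 with an on-disk superblock (-s) over the two passthru-wrapped base bdevs
  "$rpc_py" -s "$sock" bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1

  # kick off the background workload in the already-running bdevperf (-z mode)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
      -s "$sock" perform_tests &

  # hot-remove one member under load; the array must stay online with one base bdev
  "$rpc_py" -s "$sock" bdev_raid_remove_base_bdev BaseBdev1
  "$rpc_py" -s "$sock" bdev_raid_get_bdevs all |
      jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)"'
  # expected: "online 1"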
00:23:51.063 [2024-07-12 10:50:26.007308] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:51.063 [2024-07-12 10:50:26.023496] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xcf7c50 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.063 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.321 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:51.321 "name": "raid_bdev1", 00:23:51.321 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:51.321 "strip_size_kb": 0, 00:23:51.321 "state": "online", 00:23:51.321 "raid_level": "raid1", 00:23:51.321 "superblock": true, 00:23:51.321 "num_base_bdevs": 2, 00:23:51.321 "num_base_bdevs_discovered": 1, 00:23:51.321 "num_base_bdevs_operational": 1, 00:23:51.321 "base_bdevs_list": [ 00:23:51.321 { 00:23:51.321 "name": null, 00:23:51.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.321 "is_configured": false, 00:23:51.321 "data_offset": 2048, 00:23:51.321 "data_size": 63488 00:23:51.321 }, 00:23:51.321 { 00:23:51.321 "name": "BaseBdev2", 00:23:51.321 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:51.321 "is_configured": true, 00:23:51.321 "data_offset": 2048, 00:23:51.321 "data_size": 63488 00:23:51.321 } 00:23:51.321 ] 00:23:51.321 }' 00:23:51.321 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:51.321 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:51.886 10:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:52.144 [2024-07-12 10:50:27.170323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:52.144 10:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:52.144 [2024-07-12 10:50:27.237989] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc63230 00:23:52.144 [2024-07-12 10:50:27.240353] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:23:52.402 [2024-07-12 10:50:27.359051] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:52.402 [2024-07-12 10:50:27.359436] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:52.402 [2024-07-12 10:50:27.470168] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:52.402 [2024-07-12 10:50:27.470316] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:52.660 [2024-07-12 10:50:27.733766] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:52.918 [2024-07-12 10:50:27.958741] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:53.177 [2024-07-12 10:50:28.223227] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:53.177 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:53.177 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.177 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:53.177 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:53.177 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.177 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.177 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.434 [2024-07-12 10:50:28.475517] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:53.434 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.434 "name": "raid_bdev1", 00:23:53.434 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:53.434 "strip_size_kb": 0, 00:23:53.434 "state": "online", 00:23:53.434 "raid_level": "raid1", 00:23:53.434 "superblock": true, 00:23:53.434 "num_base_bdevs": 2, 00:23:53.434 "num_base_bdevs_discovered": 2, 00:23:53.434 "num_base_bdevs_operational": 2, 00:23:53.434 "process": { 00:23:53.434 "type": "rebuild", 00:23:53.434 "target": "spare", 00:23:53.434 "progress": { 00:23:53.434 "blocks": 14336, 00:23:53.434 "percent": 22 00:23:53.434 } 00:23:53.434 }, 00:23:53.434 "base_bdevs_list": [ 00:23:53.434 { 00:23:53.434 "name": "spare", 00:23:53.434 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:23:53.434 "is_configured": true, 00:23:53.434 "data_offset": 2048, 00:23:53.434 "data_size": 63488 00:23:53.434 }, 00:23:53.434 { 00:23:53.434 "name": "BaseBdev2", 00:23:53.434 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:53.434 "is_configured": true, 00:23:53.435 "data_offset": 2048, 00:23:53.435 "data_size": 63488 00:23:53.435 } 00:23:53.435 ] 00:23:53.435 }' 00:23:53.435 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.435 10:50:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:53.435 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.435 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:53.435 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:53.692 [2024-07-12 10:50:28.795035] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:53.692 [2024-07-12 10:50:28.824859] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:53.692 [2024-07-12 10:50:28.843167] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:53.692 [2024-07-12 10:50:28.845031] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:53.692 [2024-07-12 10:50:28.845060] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:53.692 [2024-07-12 10:50:28.845071] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:53.692 [2024-07-12 10:50:28.852025] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xcf7c50 00:23:53.693 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:53.693 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:53.693 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:53.693 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:53.693 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:53.693 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:53.951 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:53.951 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:53.951 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:53.951 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:53.951 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.951 10:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.951 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.951 "name": "raid_bdev1", 00:23:53.951 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:53.951 "strip_size_kb": 0, 00:23:53.951 "state": "online", 00:23:53.951 "raid_level": "raid1", 00:23:53.951 "superblock": true, 00:23:53.951 "num_base_bdevs": 2, 00:23:53.951 "num_base_bdevs_discovered": 1, 00:23:53.951 "num_base_bdevs_operational": 1, 00:23:53.951 "base_bdevs_list": [ 00:23:53.951 { 00:23:53.951 "name": null, 00:23:53.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.951 "is_configured": false, 
00:23:53.951 "data_offset": 2048, 00:23:53.951 "data_size": 63488 00:23:53.951 }, 00:23:53.951 { 00:23:53.951 "name": "BaseBdev2", 00:23:53.951 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:53.951 "is_configured": true, 00:23:53.951 "data_offset": 2048, 00:23:53.951 "data_size": 63488 00:23:53.951 } 00:23:53.951 ] 00:23:53.951 }' 00:23:53.951 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.951 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.884 "name": "raid_bdev1", 00:23:54.884 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:54.884 "strip_size_kb": 0, 00:23:54.884 "state": "online", 00:23:54.884 "raid_level": "raid1", 00:23:54.884 "superblock": true, 00:23:54.884 "num_base_bdevs": 2, 00:23:54.884 "num_base_bdevs_discovered": 1, 00:23:54.884 "num_base_bdevs_operational": 1, 00:23:54.884 "base_bdevs_list": [ 00:23:54.884 { 00:23:54.884 "name": null, 00:23:54.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:54.884 "is_configured": false, 00:23:54.884 "data_offset": 2048, 00:23:54.884 "data_size": 63488 00:23:54.884 }, 00:23:54.884 { 00:23:54.884 "name": "BaseBdev2", 00:23:54.884 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:54.884 "is_configured": true, 00:23:54.884 "data_offset": 2048, 00:23:54.884 "data_size": 63488 00:23:54.884 } 00:23:54.884 ] 00:23:54.884 }' 00:23:54.884 10:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.884 10:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:54.884 10:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.884 10:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:54.884 10:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:55.142 [2024-07-12 10:50:30.287800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:55.399 10:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:55.399 [2024-07-12 10:50:30.355026] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcf7e60 00:23:55.399 [2024-07-12 10:50:30.356498] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:23:55.399 [2024-07-12 10:50:30.463236] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:55.399 [2024-07-12 10:50:30.463562] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:55.657 [2024-07-12 10:50:30.665493] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:55.657 [2024-07-12 10:50:30.665702] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:55.914 [2024-07-12 10:50:31.039331] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:56.172 [2024-07-12 10:50:31.183303] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:56.172 [2024-07-12 10:50:31.183545] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:56.172 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:56.172 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:56.172 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:56.172 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:56.172 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:56.172 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.172 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.430 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.430 "name": "raid_bdev1", 00:23:56.430 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:56.430 "strip_size_kb": 0, 00:23:56.430 "state": "online", 00:23:56.430 "raid_level": "raid1", 00:23:56.430 "superblock": true, 00:23:56.430 "num_base_bdevs": 2, 00:23:56.430 "num_base_bdevs_discovered": 2, 00:23:56.430 "num_base_bdevs_operational": 2, 00:23:56.430 "process": { 00:23:56.430 "type": "rebuild", 00:23:56.430 "target": "spare", 00:23:56.430 "progress": { 00:23:56.430 "blocks": 14336, 00:23:56.430 "percent": 22 00:23:56.430 } 00:23:56.430 }, 00:23:56.430 "base_bdevs_list": [ 00:23:56.430 { 00:23:56.430 "name": "spare", 00:23:56.430 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:23:56.430 "is_configured": true, 00:23:56.430 "data_offset": 2048, 00:23:56.430 "data_size": 63488 00:23:56.430 }, 00:23:56.430 { 00:23:56.430 "name": "BaseBdev2", 00:23:56.430 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:56.430 "is_configured": true, 00:23:56.430 "data_offset": 2048, 00:23:56.430 "data_size": 63488 00:23:56.430 } 00:23:56.430 ] 00:23:56.430 }' 00:23:56.430 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.688 [2024-07-12 10:50:31.669028] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:56.688 [2024-07-12 10:50:31.669276] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:56.688 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=827 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.688 10:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.947 [2024-07-12 10:50:32.000103] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:56.947 [2024-07-12 10:50:32.128276] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:57.206 10:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:57.206 "name": "raid_bdev1", 00:23:57.206 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:57.206 "strip_size_kb": 0, 00:23:57.206 "state": "online", 00:23:57.206 "raid_level": "raid1", 00:23:57.206 "superblock": true, 00:23:57.206 "num_base_bdevs": 2, 00:23:57.206 "num_base_bdevs_discovered": 2, 00:23:57.206 "num_base_bdevs_operational": 2, 00:23:57.206 "process": { 00:23:57.206 "type": "rebuild", 00:23:57.206 "target": "spare", 00:23:57.206 "progress": { 00:23:57.206 "blocks": 22528, 00:23:57.206 "percent": 35 00:23:57.206 } 00:23:57.206 }, 00:23:57.206 "base_bdevs_list": [ 00:23:57.206 { 00:23:57.206 "name": "spare", 00:23:57.206 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:23:57.206 "is_configured": true, 00:23:57.206 "data_offset": 2048, 00:23:57.206 "data_size": 63488 00:23:57.206 }, 00:23:57.206 { 00:23:57.206 "name": 
"BaseBdev2", 00:23:57.206 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:57.206 "is_configured": true, 00:23:57.206 "data_offset": 2048, 00:23:57.206 "data_size": 63488 00:23:57.206 } 00:23:57.206 ] 00:23:57.206 }' 00:23:57.206 10:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:57.206 10:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:57.206 10:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:57.206 10:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:57.206 10:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:57.464 [2024-07-12 10:50:32.475884] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:57.464 [2024-07-12 10:50:32.476115] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:57.723 [2024-07-12 10:50:32.841830] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:57.988 [2024-07-12 10:50:33.069171] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.250 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.250 [2024-07-12 10:50:33.409447] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:23:58.508 [2024-07-12 10:50:33.520840] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:58.508 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:58.508 "name": "raid_bdev1", 00:23:58.508 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:58.508 "strip_size_kb": 0, 00:23:58.508 "state": "online", 00:23:58.508 "raid_level": "raid1", 00:23:58.508 "superblock": true, 00:23:58.508 "num_base_bdevs": 2, 00:23:58.508 "num_base_bdevs_discovered": 2, 00:23:58.508 "num_base_bdevs_operational": 2, 00:23:58.508 "process": { 00:23:58.508 "type": "rebuild", 00:23:58.508 "target": "spare", 00:23:58.508 "progress": { 00:23:58.508 "blocks": 40960, 00:23:58.508 "percent": 64 00:23:58.508 } 00:23:58.508 }, 00:23:58.508 "base_bdevs_list": [ 00:23:58.508 { 00:23:58.508 "name": "spare", 00:23:58.508 "uuid": 
"5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:23:58.508 "is_configured": true, 00:23:58.508 "data_offset": 2048, 00:23:58.508 "data_size": 63488 00:23:58.508 }, 00:23:58.508 { 00:23:58.508 "name": "BaseBdev2", 00:23:58.508 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:58.508 "is_configured": true, 00:23:58.508 "data_offset": 2048, 00:23:58.508 "data_size": 63488 00:23:58.508 } 00:23:58.508 ] 00:23:58.508 }' 00:23:58.508 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:58.508 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:58.508 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:58.508 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:58.508 10:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:58.766 [2024-07-12 10:50:33.758490] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:23:59.024 [2024-07-12 10:50:34.213702] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:59.593 [2024-07-12 10:50:34.542060] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.593 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.593 [2024-07-12 10:50:34.744246] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:23:59.593 [2024-07-12 10:50:34.744525] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:23:59.852 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:59.852 "name": "raid_bdev1", 00:23:59.852 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:23:59.852 "strip_size_kb": 0, 00:23:59.852 "state": "online", 00:23:59.852 "raid_level": "raid1", 00:23:59.852 "superblock": true, 00:23:59.852 "num_base_bdevs": 2, 00:23:59.852 "num_base_bdevs_discovered": 2, 00:23:59.852 "num_base_bdevs_operational": 2, 00:23:59.852 "process": { 00:23:59.852 "type": "rebuild", 00:23:59.852 "target": "spare", 00:23:59.852 "progress": { 00:23:59.852 "blocks": 59392, 00:23:59.852 "percent": 93 00:23:59.852 } 00:23:59.852 }, 00:23:59.852 "base_bdevs_list": [ 00:23:59.852 { 00:23:59.852 "name": 
"spare", 00:23:59.852 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:23:59.852 "is_configured": true, 00:23:59.852 "data_offset": 2048, 00:23:59.852 "data_size": 63488 00:23:59.852 }, 00:23:59.852 { 00:23:59.852 "name": "BaseBdev2", 00:23:59.852 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:23:59.852 "is_configured": true, 00:23:59.852 "data_offset": 2048, 00:23:59.852 "data_size": 63488 00:23:59.852 } 00:23:59.852 ] 00:23:59.852 }' 00:23:59.852 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:59.852 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:59.852 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:59.852 [2024-07-12 10:50:34.994551] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:59.852 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:59.852 10:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:00.112 [2024-07-12 10:50:35.102779] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:00.112 [2024-07-12 10:50:35.104719] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:01.049 10:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:01.049 10:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:01.049 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:01.050 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:01.050 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:01.050 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:01.050 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.050 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:01.309 "name": "raid_bdev1", 00:24:01.309 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:01.309 "strip_size_kb": 0, 00:24:01.309 "state": "online", 00:24:01.309 "raid_level": "raid1", 00:24:01.309 "superblock": true, 00:24:01.309 "num_base_bdevs": 2, 00:24:01.309 "num_base_bdevs_discovered": 2, 00:24:01.309 "num_base_bdevs_operational": 2, 00:24:01.309 "base_bdevs_list": [ 00:24:01.309 { 00:24:01.309 "name": "spare", 00:24:01.309 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:24:01.309 "is_configured": true, 00:24:01.309 "data_offset": 2048, 00:24:01.309 "data_size": 63488 00:24:01.309 }, 00:24:01.309 { 00:24:01.309 "name": "BaseBdev2", 00:24:01.309 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:01.309 "is_configured": true, 00:24:01.309 "data_offset": 2048, 00:24:01.309 "data_size": 63488 00:24:01.309 } 00:24:01.309 ] 00:24:01.309 }' 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:01.309 10:50:36 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.309 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:01.568 "name": "raid_bdev1", 00:24:01.568 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:01.568 "strip_size_kb": 0, 00:24:01.568 "state": "online", 00:24:01.568 "raid_level": "raid1", 00:24:01.568 "superblock": true, 00:24:01.568 "num_base_bdevs": 2, 00:24:01.568 "num_base_bdevs_discovered": 2, 00:24:01.568 "num_base_bdevs_operational": 2, 00:24:01.568 "base_bdevs_list": [ 00:24:01.568 { 00:24:01.568 "name": "spare", 00:24:01.568 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:24:01.568 "is_configured": true, 00:24:01.568 "data_offset": 2048, 00:24:01.568 "data_size": 63488 00:24:01.568 }, 00:24:01.568 { 00:24:01.568 "name": "BaseBdev2", 00:24:01.568 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:01.568 "is_configured": true, 00:24:01.568 "data_offset": 2048, 00:24:01.568 "data_size": 63488 00:24:01.568 } 00:24:01.568 ] 00:24:01.568 }' 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.568 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.826 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.826 "name": "raid_bdev1", 00:24:01.826 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:01.826 "strip_size_kb": 0, 00:24:01.826 "state": "online", 00:24:01.826 "raid_level": "raid1", 00:24:01.826 "superblock": true, 00:24:01.826 "num_base_bdevs": 2, 00:24:01.826 "num_base_bdevs_discovered": 2, 00:24:01.826 "num_base_bdevs_operational": 2, 00:24:01.826 "base_bdevs_list": [ 00:24:01.826 { 00:24:01.826 "name": "spare", 00:24:01.826 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:24:01.826 "is_configured": true, 00:24:01.826 "data_offset": 2048, 00:24:01.826 "data_size": 63488 00:24:01.826 }, 00:24:01.826 { 00:24:01.826 "name": "BaseBdev2", 00:24:01.826 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:01.826 "is_configured": true, 00:24:01.826 "data_offset": 2048, 00:24:01.826 "data_size": 63488 00:24:01.826 } 00:24:01.826 ] 00:24:01.826 }' 00:24:01.826 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.826 10:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:02.422 10:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:02.681 [2024-07-12 10:50:37.760256] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:02.681 [2024-07-12 10:50:37.760289] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:02.681 00:24:02.681 Latency(us) 00:24:02.681 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:02.681 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:02.681 raid_bdev1 : 11.93 95.05 285.14 0.00 0.00 13954.11 290.28 119446.48 00:24:02.681 =================================================================================================================== 00:24:02.681 Total : 95.05 285.14 0.00 0.00 13954.11 290.28 119446.48 00:24:02.681 [2024-07-12 10:50:37.856489] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:02.681 [2024-07-12 10:50:37.856520] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:02.681 [2024-07-12 10:50:37.856593] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:02.681 [2024-07-12 10:50:37.856606] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcf7070 name raid_bdev1, state offline 00:24:02.681 0 00:24:02.939 10:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
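The trace above tears down raid_bdev1 with bdev_raid_delete and prints the accumulated I/O statistics for the rebuild run; the lines that follow then count the entries returned by bdev_raid_get_bdevs to confirm that no raid bdevs are left. A minimal standalone sketch of that check (assuming an SPDK target is still listening on the /var/tmp/spdk-raid.sock RPC socket used throughout this run; this is an illustration, not the test script itself):

  #!/usr/bin/env bash
  # Sketch only: verify that bdev_raid_delete actually removed the last raid bdev.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  count=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq length)
  [[ "$count" == 0 ]] || { echo "unexpected raid bdevs left: $count" >&2; exit 1; }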
00:24:02.940 10:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:02.940 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:03.198 /dev/nbd0 00:24:03.198 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:03.198 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:03.198 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:03.198 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:03.198 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:03.198 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:03.198 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:03.456 1+0 records in 00:24:03.456 1+0 records out 00:24:03.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283183 s, 14.5 MB/s 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:03.456 /dev/nbd1 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:03.456 1+0 records in 00:24:03.456 1+0 records out 00:24:03.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269794 s, 15.2 MB/s 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:03.456 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:03.457 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:03.457 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:03.457 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:03.457 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:03.457 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:03.714 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:03.714 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:03.715 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:03.715 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:03.715 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:03.715 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:03.715 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:03.972 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:03.972 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:03.972 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:03.972 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:03.973 10:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:04.230 
10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:04.230 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:04.489 [2024-07-12 10:50:39.651512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:04.489 [2024-07-12 10:50:39.651558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:04.489 [2024-07-12 10:50:39.651581] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb46490 00:24:04.489 [2024-07-12 10:50:39.651593] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:04.489 [2024-07-12 10:50:39.653205] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:04.489 [2024-07-12 10:50:39.653234] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:04.489 [2024-07-12 10:50:39.653312] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:04.489 [2024-07-12 10:50:39.653337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:04.489 [2024-07-12 10:50:39.653433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:04.489 spare 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.489 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.747 [2024-07-12 10:50:39.753756] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcf7490 00:24:04.747 [2024-07-12 10:50:39.753772] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:04.747 [2024-07-12 10:50:39.753956] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcf0490 00:24:04.747 [2024-07-12 10:50:39.754101] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcf7490 00:24:04.747 [2024-07-12 10:50:39.754111] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcf7490 00:24:04.747 [2024-07-12 10:50:39.754222] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:04.747 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.747 "name": "raid_bdev1", 00:24:04.747 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:04.747 "strip_size_kb": 0, 00:24:04.747 "state": "online", 00:24:04.747 "raid_level": "raid1", 00:24:04.747 "superblock": true, 00:24:04.747 "num_base_bdevs": 2, 00:24:04.747 "num_base_bdevs_discovered": 2, 00:24:04.747 "num_base_bdevs_operational": 2, 00:24:04.747 "base_bdevs_list": [ 00:24:04.747 { 00:24:04.747 "name": "spare", 00:24:04.747 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:24:04.747 "is_configured": true, 00:24:04.747 "data_offset": 2048, 00:24:04.747 "data_size": 63488 00:24:04.747 }, 00:24:04.747 { 00:24:04.747 "name": "BaseBdev2", 00:24:04.747 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:04.747 "is_configured": true, 00:24:04.747 "data_offset": 2048, 00:24:04.747 "data_size": 63488 00:24:04.747 } 00:24:04.747 ] 00:24:04.747 }' 00:24:04.747 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:04.747 10:50:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.692 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.692 "name": "raid_bdev1", 00:24:05.692 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:05.692 "strip_size_kb": 0, 00:24:05.692 "state": "online", 00:24:05.692 "raid_level": "raid1", 00:24:05.692 "superblock": true, 00:24:05.692 "num_base_bdevs": 2, 00:24:05.692 "num_base_bdevs_discovered": 2, 00:24:05.692 "num_base_bdevs_operational": 2, 00:24:05.692 "base_bdevs_list": [ 00:24:05.692 { 00:24:05.692 "name": "spare", 00:24:05.692 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:24:05.692 "is_configured": true, 00:24:05.692 "data_offset": 2048, 
00:24:05.692 "data_size": 63488 00:24:05.692 }, 00:24:05.692 { 00:24:05.692 "name": "BaseBdev2", 00:24:05.692 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:05.692 "is_configured": true, 00:24:05.692 "data_offset": 2048, 00:24:05.693 "data_size": 63488 00:24:05.693 } 00:24:05.693 ] 00:24:05.693 }' 00:24:05.693 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.693 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:05.693 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.693 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:05.693 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.693 10:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:05.951 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:05.951 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:06.209 [2024-07-12 10:50:41.336316] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.209 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.467 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.467 "name": "raid_bdev1", 00:24:06.467 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:06.467 "strip_size_kb": 0, 00:24:06.467 "state": "online", 00:24:06.467 "raid_level": "raid1", 00:24:06.467 "superblock": true, 00:24:06.467 "num_base_bdevs": 2, 00:24:06.467 "num_base_bdevs_discovered": 1, 00:24:06.467 "num_base_bdevs_operational": 1, 00:24:06.467 "base_bdevs_list": [ 00:24:06.467 { 00:24:06.467 "name": null, 00:24:06.467 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:24:06.467 "is_configured": false, 00:24:06.467 "data_offset": 2048, 00:24:06.467 "data_size": 63488 00:24:06.467 }, 00:24:06.467 { 00:24:06.467 "name": "BaseBdev2", 00:24:06.467 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:06.467 "is_configured": true, 00:24:06.467 "data_offset": 2048, 00:24:06.467 "data_size": 63488 00:24:06.467 } 00:24:06.467 ] 00:24:06.467 }' 00:24:06.467 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.467 10:50:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:07.034 10:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:07.292 [2024-07-12 10:50:42.415342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:07.292 [2024-07-12 10:50:42.415493] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:07.292 [2024-07-12 10:50:42.415511] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:07.292 [2024-07-12 10:50:42.415538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:07.292 [2024-07-12 10:50:42.421008] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcf93d0 00:24:07.292 [2024-07-12 10:50:42.423377] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:07.292 10:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.665 "name": "raid_bdev1", 00:24:08.665 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:08.665 "strip_size_kb": 0, 00:24:08.665 "state": "online", 00:24:08.665 "raid_level": "raid1", 00:24:08.665 "superblock": true, 00:24:08.665 "num_base_bdevs": 2, 00:24:08.665 "num_base_bdevs_discovered": 2, 00:24:08.665 "num_base_bdevs_operational": 2, 00:24:08.665 "process": { 00:24:08.665 "type": "rebuild", 00:24:08.665 "target": "spare", 00:24:08.665 "progress": { 00:24:08.665 "blocks": 24576, 00:24:08.665 "percent": 38 00:24:08.665 } 00:24:08.665 }, 00:24:08.665 "base_bdevs_list": [ 00:24:08.665 { 00:24:08.665 "name": "spare", 00:24:08.665 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:24:08.665 "is_configured": true, 00:24:08.665 "data_offset": 2048, 00:24:08.665 
"data_size": 63488 00:24:08.665 }, 00:24:08.665 { 00:24:08.665 "name": "BaseBdev2", 00:24:08.665 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:08.665 "is_configured": true, 00:24:08.665 "data_offset": 2048, 00:24:08.665 "data_size": 63488 00:24:08.665 } 00:24:08.665 ] 00:24:08.665 }' 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.665 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:08.666 10:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:08.924 [2024-07-12 10:50:44.003509] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:08.924 [2024-07-12 10:50:44.036422] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:08.924 [2024-07-12 10:50:44.036473] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:08.924 [2024-07-12 10:50:44.036497] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:08.924 [2024-07-12 10:50:44.036506] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.924 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.182 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.182 "name": "raid_bdev1", 00:24:09.182 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:09.182 "strip_size_kb": 0, 00:24:09.182 "state": "online", 00:24:09.182 "raid_level": "raid1", 00:24:09.182 "superblock": true, 00:24:09.182 "num_base_bdevs": 2, 00:24:09.182 "num_base_bdevs_discovered": 1, 00:24:09.182 "num_base_bdevs_operational": 1, 00:24:09.182 "base_bdevs_list": [ 
00:24:09.182 { 00:24:09.182 "name": null, 00:24:09.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.182 "is_configured": false, 00:24:09.182 "data_offset": 2048, 00:24:09.182 "data_size": 63488 00:24:09.182 }, 00:24:09.182 { 00:24:09.182 "name": "BaseBdev2", 00:24:09.182 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:09.182 "is_configured": true, 00:24:09.182 "data_offset": 2048, 00:24:09.182 "data_size": 63488 00:24:09.182 } 00:24:09.182 ] 00:24:09.182 }' 00:24:09.182 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.182 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:09.749 10:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:10.008 [2024-07-12 10:50:45.124136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:10.008 [2024-07-12 10:50:45.124182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:10.008 [2024-07-12 10:50:45.124208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcf7710 00:24:10.008 [2024-07-12 10:50:45.124221] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:10.008 [2024-07-12 10:50:45.124597] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:10.008 [2024-07-12 10:50:45.124617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:10.008 [2024-07-12 10:50:45.124696] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:10.008 [2024-07-12 10:50:45.124708] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:10.008 [2024-07-12 10:50:45.124718] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
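As the messages above show, deleting the spare passthru in the middle of a rebuild leaves raid_bdev1 online with a single operational base bdev, and re-creating the passthru on top of spare_delay lets the examine path find the raid superblock on spare and re-add it, which starts another rebuild. A rough sketch of that remove-and-re-add cycle using the same RPCs traced in this run (the socket path and the spare_delay/spare names are taken from the log; sketch only, not the verbatim test code):

  #!/usr/bin/env bash
  # Sketch: drop the spare passthru during a rebuild, then re-register it so the
  # raid superblock on spare is examined and the bdev is re-added to raid_bdev1.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  "$rpc" -s "$sock" bdev_passthru_delete spare                    # raid_bdev1 degrades to 1 operational base bdev
  "$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare  # examine re-adds spare, rebuild restarts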
00:24:10.008 [2024-07-12 10:50:45.124737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:10.008 [2024-07-12 10:50:45.130004] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcf93d0 00:24:10.008 spare 00:24:10.008 [2024-07-12 10:50:45.131453] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:10.008 10:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.382 "name": "raid_bdev1", 00:24:11.382 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:11.382 "strip_size_kb": 0, 00:24:11.382 "state": "online", 00:24:11.382 "raid_level": "raid1", 00:24:11.382 "superblock": true, 00:24:11.382 "num_base_bdevs": 2, 00:24:11.382 "num_base_bdevs_discovered": 2, 00:24:11.382 "num_base_bdevs_operational": 2, 00:24:11.382 "process": { 00:24:11.382 "type": "rebuild", 00:24:11.382 "target": "spare", 00:24:11.382 "progress": { 00:24:11.382 "blocks": 24576, 00:24:11.382 "percent": 38 00:24:11.382 } 00:24:11.382 }, 00:24:11.382 "base_bdevs_list": [ 00:24:11.382 { 00:24:11.382 "name": "spare", 00:24:11.382 "uuid": "5eb7a94c-ecb8-5f50-b28c-a2d1008561f1", 00:24:11.382 "is_configured": true, 00:24:11.382 "data_offset": 2048, 00:24:11.382 "data_size": 63488 00:24:11.382 }, 00:24:11.382 { 00:24:11.382 "name": "BaseBdev2", 00:24:11.382 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:11.382 "is_configured": true, 00:24:11.382 "data_offset": 2048, 00:24:11.382 "data_size": 63488 00:24:11.382 } 00:24:11.382 ] 00:24:11.382 }' 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:11.382 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:11.641 [2024-07-12 10:50:46.714889] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.641 [2024-07-12 10:50:46.743885] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:11.641 [2024-07-12 
10:50:46.743934] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.641 [2024-07-12 10:50:46.743949] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.641 [2024-07-12 10:50:46.743958] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.641 10:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.900 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:11.900 "name": "raid_bdev1", 00:24:11.900 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:11.900 "strip_size_kb": 0, 00:24:11.900 "state": "online", 00:24:11.900 "raid_level": "raid1", 00:24:11.900 "superblock": true, 00:24:11.900 "num_base_bdevs": 2, 00:24:11.900 "num_base_bdevs_discovered": 1, 00:24:11.900 "num_base_bdevs_operational": 1, 00:24:11.900 "base_bdevs_list": [ 00:24:11.900 { 00:24:11.900 "name": null, 00:24:11.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.900 "is_configured": false, 00:24:11.900 "data_offset": 2048, 00:24:11.900 "data_size": 63488 00:24:11.900 }, 00:24:11.900 { 00:24:11.900 "name": "BaseBdev2", 00:24:11.900 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:11.900 "is_configured": true, 00:24:11.900 "data_offset": 2048, 00:24:11.900 "data_size": 63488 00:24:11.900 } 00:24:11.900 ] 00:24:11.900 }' 00:24:11.900 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:11.900 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:12.466 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:12.466 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.466 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:12.466 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:12.466 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:24:12.466 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.466 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.725 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.725 "name": "raid_bdev1", 00:24:12.725 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:12.725 "strip_size_kb": 0, 00:24:12.725 "state": "online", 00:24:12.725 "raid_level": "raid1", 00:24:12.725 "superblock": true, 00:24:12.725 "num_base_bdevs": 2, 00:24:12.725 "num_base_bdevs_discovered": 1, 00:24:12.725 "num_base_bdevs_operational": 1, 00:24:12.725 "base_bdevs_list": [ 00:24:12.725 { 00:24:12.725 "name": null, 00:24:12.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.725 "is_configured": false, 00:24:12.725 "data_offset": 2048, 00:24:12.725 "data_size": 63488 00:24:12.725 }, 00:24:12.725 { 00:24:12.725 "name": "BaseBdev2", 00:24:12.725 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:12.725 "is_configured": true, 00:24:12.725 "data_offset": 2048, 00:24:12.725 "data_size": 63488 00:24:12.725 } 00:24:12.725 ] 00:24:12.725 }' 00:24:12.725 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.725 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:12.725 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.983 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:12.983 10:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:13.242 10:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:13.242 [2024-07-12 10:50:48.401861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:13.242 [2024-07-12 10:50:48.401910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:13.242 [2024-07-12 10:50:48.401933] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcf4fd0 00:24:13.242 [2024-07-12 10:50:48.401947] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:13.242 [2024-07-12 10:50:48.402295] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:13.242 [2024-07-12 10:50:48.402315] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:13.242 [2024-07-12 10:50:48.402379] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:13.242 [2024-07-12 10:50:48.402391] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:13.242 [2024-07-12 10:50:48.402401] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:13.242 BaseBdev1 00:24:13.242 10:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:14.618 10:50:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:14.618 "name": "raid_bdev1", 00:24:14.618 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:14.618 "strip_size_kb": 0, 00:24:14.618 "state": "online", 00:24:14.618 "raid_level": "raid1", 00:24:14.618 "superblock": true, 00:24:14.618 "num_base_bdevs": 2, 00:24:14.618 "num_base_bdevs_discovered": 1, 00:24:14.618 "num_base_bdevs_operational": 1, 00:24:14.618 "base_bdevs_list": [ 00:24:14.618 { 00:24:14.618 "name": null, 00:24:14.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:14.618 "is_configured": false, 00:24:14.618 "data_offset": 2048, 00:24:14.618 "data_size": 63488 00:24:14.618 }, 00:24:14.618 { 00:24:14.618 "name": "BaseBdev2", 00:24:14.618 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:14.618 "is_configured": true, 00:24:14.618 "data_offset": 2048, 00:24:14.618 "data_size": 63488 00:24:14.618 } 00:24:14.618 ] 00:24:14.618 }' 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:14.618 10:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:15.183 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:15.183 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.183 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:15.183 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:15.183 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.183 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.183 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.440 10:50:50 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.440 "name": "raid_bdev1", 00:24:15.440 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:15.440 "strip_size_kb": 0, 00:24:15.440 "state": "online", 00:24:15.440 "raid_level": "raid1", 00:24:15.440 "superblock": true, 00:24:15.440 "num_base_bdevs": 2, 00:24:15.440 "num_base_bdevs_discovered": 1, 00:24:15.440 "num_base_bdevs_operational": 1, 00:24:15.440 "base_bdevs_list": [ 00:24:15.440 { 00:24:15.440 "name": null, 00:24:15.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.440 "is_configured": false, 00:24:15.440 "data_offset": 2048, 00:24:15.440 "data_size": 63488 00:24:15.440 }, 00:24:15.440 { 00:24:15.440 "name": "BaseBdev2", 00:24:15.440 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:15.440 "is_configured": true, 00:24:15.440 "data_offset": 2048, 00:24:15.440 "data_size": 63488 00:24:15.440 } 00:24:15.440 ] 00:24:15.440 }' 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:15.440 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:15.697 [2024-07-12 10:50:50.852955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:15.697 [2024-07-12 
10:50:50.853074] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:15.697 [2024-07-12 10:50:50.853090] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:15.697 request: 00:24:15.697 { 00:24:15.697 "base_bdev": "BaseBdev1", 00:24:15.697 "raid_bdev": "raid_bdev1", 00:24:15.697 "method": "bdev_raid_add_base_bdev", 00:24:15.697 "req_id": 1 00:24:15.697 } 00:24:15.697 Got JSON-RPC error response 00:24:15.697 response: 00:24:15.697 { 00:24:15.697 "code": -22, 00:24:15.697 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:15.697 } 00:24:15.697 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:15.697 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:15.697 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:15.697 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:15.697 10:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.069 10:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.069 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.069 "name": "raid_bdev1", 00:24:17.069 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:17.069 "strip_size_kb": 0, 00:24:17.069 "state": "online", 00:24:17.069 "raid_level": "raid1", 00:24:17.069 "superblock": true, 00:24:17.069 "num_base_bdevs": 2, 00:24:17.069 "num_base_bdevs_discovered": 1, 00:24:17.069 "num_base_bdevs_operational": 1, 00:24:17.069 "base_bdevs_list": [ 00:24:17.069 { 00:24:17.069 "name": null, 00:24:17.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.069 "is_configured": false, 00:24:17.069 "data_offset": 2048, 00:24:17.069 "data_size": 63488 00:24:17.069 }, 00:24:17.069 { 00:24:17.069 "name": "BaseBdev2", 00:24:17.069 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:17.069 "is_configured": true, 
00:24:17.069 "data_offset": 2048, 00:24:17.069 "data_size": 63488 00:24:17.069 } 00:24:17.069 ] 00:24:17.069 }' 00:24:17.069 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.069 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:17.672 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:17.672 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:17.672 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:17.672 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:17.672 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:17.672 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.672 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:17.944 "name": "raid_bdev1", 00:24:17.944 "uuid": "0d862e1a-cfee-42e3-9531-f098a931fd55", 00:24:17.944 "strip_size_kb": 0, 00:24:17.944 "state": "online", 00:24:17.944 "raid_level": "raid1", 00:24:17.944 "superblock": true, 00:24:17.944 "num_base_bdevs": 2, 00:24:17.944 "num_base_bdevs_discovered": 1, 00:24:17.944 "num_base_bdevs_operational": 1, 00:24:17.944 "base_bdevs_list": [ 00:24:17.944 { 00:24:17.944 "name": null, 00:24:17.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.944 "is_configured": false, 00:24:17.944 "data_offset": 2048, 00:24:17.944 "data_size": 63488 00:24:17.944 }, 00:24:17.944 { 00:24:17.944 "name": "BaseBdev2", 00:24:17.944 "uuid": "cd7cc9f1-1d17-5806-93eb-4a18b80e98fd", 00:24:17.944 "is_configured": true, 00:24:17.944 "data_offset": 2048, 00:24:17.944 "data_size": 63488 00:24:17.944 } 00:24:17.944 ] 00:24:17.944 }' 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2131996 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2131996 ']' 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2131996 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:17.944 10:50:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2131996 00:24:17.944 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:17.944 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:24:17.944 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2131996' 00:24:17.944 killing process with pid 2131996 00:24:17.944 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2131996 00:24:17.944 Received shutdown signal, test time was about 27.058324 seconds 00:24:17.944 00:24:17.944 Latency(us) 00:24:17.944 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:17.944 =================================================================================================================== 00:24:17.944 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:17.944 [2024-07-12 10:50:53.016709] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:17.944 [2024-07-12 10:50:53.016800] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:17.944 [2024-07-12 10:50:53.016847] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:17.944 [2024-07-12 10:50:53.016859] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcf7490 name raid_bdev1, state offline 00:24:17.944 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2131996 00:24:17.944 [2024-07-12 10:50:53.037812] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:18.202 00:24:18.202 real 0m31.472s 00:24:18.202 user 0m49.717s 00:24:18.202 sys 0m4.587s 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:18.202 ************************************ 00:24:18.202 END TEST raid_rebuild_test_sb_io 00:24:18.202 ************************************ 00:24:18.202 10:50:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:18.202 10:50:53 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:18.202 10:50:53 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:24:18.202 10:50:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:18.202 10:50:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:18.202 10:50:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:18.202 ************************************ 00:24:18.202 START TEST raid_rebuild_test 00:24:18.202 ************************************ 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo 
BaseBdev1 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2136506 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2136506 /var/tmp/spdk-raid.sock 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2136506 ']' 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:18.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:18.202 10:50:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:18.459 [2024-07-12 10:50:53.406224] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:24:18.459 [2024-07-12 10:50:53.406290] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2136506 ] 00:24:18.459 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:18.459 Zero copy mechanism will not be used. 00:24:18.459 [2024-07-12 10:50:53.533546] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:18.459 [2024-07-12 10:50:53.639267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:18.716 [2024-07-12 10:50:53.707808] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:18.716 [2024-07-12 10:50:53.707844] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:19.280 10:50:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:19.280 10:50:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:19.280 10:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:19.280 10:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:19.537 BaseBdev1_malloc 00:24:19.537 10:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:19.794 [2024-07-12 10:50:54.814153] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:19.794 [2024-07-12 10:50:54.814201] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:19.794 [2024-07-12 10:50:54.814224] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1650d40 00:24:19.794 [2024-07-12 10:50:54.814237] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:19.794 [2024-07-12 10:50:54.815954] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:19.794 [2024-07-12 10:50:54.815985] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:19.794 BaseBdev1 00:24:19.794 10:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:19.794 10:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:20.052 BaseBdev2_malloc 00:24:20.052 10:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:20.309 [2024-07-12 10:50:55.301607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:20.309 [2024-07-12 10:50:55.301654] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.309 [2024-07-12 10:50:55.301679] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1651860 00:24:20.309 [2024-07-12 10:50:55.301691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.310 [2024-07-12 10:50:55.303264] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.310 [2024-07-12 10:50:55.303295] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:20.310 BaseBdev2 00:24:20.310 10:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:20.310 10:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:20.568 BaseBdev3_malloc 00:24:20.568 10:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:20.825 [2024-07-12 10:50:55.808760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:20.825 [2024-07-12 10:50:55.808806] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.825 [2024-07-12 10:50:55.808828] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17fe8f0 00:24:20.825 [2024-07-12 10:50:55.808840] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.825 [2024-07-12 10:50:55.810414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.825 [2024-07-12 10:50:55.810443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:20.825 BaseBdev3 00:24:20.825 10:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:20.825 10:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:21.083 BaseBdev4_malloc 00:24:21.083 10:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:21.341 [2024-07-12 10:50:56.302742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:21.341 [2024-07-12 10:50:56.302788] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.341 [2024-07-12 10:50:56.302810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17fdad0 00:24:21.341 [2024-07-12 10:50:56.302823] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.341 [2024-07-12 10:50:56.304341] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.341 [2024-07-12 10:50:56.304370] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:21.341 BaseBdev4 00:24:21.341 10:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:21.599 spare_malloc 00:24:21.599 10:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:21.856 spare_delay 00:24:21.856 10:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:21.857 [2024-07-12 10:50:57.029263] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:21.857 [2024-07-12 10:50:57.029311] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.857 [2024-07-12 10:50:57.029333] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18025b0 00:24:21.884 [2024-07-12 10:50:57.029346] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.884 [2024-07-12 10:50:57.030982] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.884 [2024-07-12 10:50:57.031012] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:21.884 spare 00:24:21.884 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:22.141 [2024-07-12 10:50:57.265907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:22.141 [2024-07-12 10:50:57.267253] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:22.141 [2024-07-12 10:50:57.267307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:22.141 [2024-07-12 10:50:57.267353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:22.141 [2024-07-12 10:50:57.267436] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17818a0 00:24:22.141 [2024-07-12 10:50:57.267452] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:22.141 [2024-07-12 10:50:57.267678] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17fbe10 00:24:22.141 [2024-07-12 10:50:57.267826] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17818a0 00:24:22.141 [2024-07-12 10:50:57.267837] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17818a0 00:24:22.141 [2024-07-12 10:50:57.267952] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.141 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.399 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.399 "name": "raid_bdev1", 00:24:22.399 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:22.399 "strip_size_kb": 0, 00:24:22.399 "state": "online", 00:24:22.399 "raid_level": "raid1", 00:24:22.399 "superblock": false, 00:24:22.399 "num_base_bdevs": 4, 00:24:22.399 "num_base_bdevs_discovered": 4, 00:24:22.399 "num_base_bdevs_operational": 4, 00:24:22.399 "base_bdevs_list": [ 00:24:22.399 { 00:24:22.399 "name": "BaseBdev1", 00:24:22.399 "uuid": "a6c976fc-ba1d-5330-9e3e-fd2e5c753062", 00:24:22.399 "is_configured": true, 00:24:22.399 "data_offset": 0, 00:24:22.399 "data_size": 65536 00:24:22.399 }, 00:24:22.399 { 00:24:22.399 "name": "BaseBdev2", 00:24:22.399 "uuid": "bc5712a4-8019-54d2-b729-a01d2af6649b", 00:24:22.399 "is_configured": true, 00:24:22.399 "data_offset": 0, 00:24:22.399 "data_size": 65536 00:24:22.399 }, 00:24:22.399 { 00:24:22.399 "name": "BaseBdev3", 00:24:22.399 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:22.399 "is_configured": true, 00:24:22.399 "data_offset": 0, 00:24:22.399 "data_size": 65536 00:24:22.399 }, 00:24:22.399 { 00:24:22.399 "name": "BaseBdev4", 00:24:22.399 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:22.399 "is_configured": true, 00:24:22.399 "data_offset": 0, 00:24:22.399 "data_size": 65536 00:24:22.399 } 00:24:22.399 ] 00:24:22.399 }' 00:24:22.399 10:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.399 10:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:22.963 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:22.963 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:23.220 [2024-07-12 10:50:58.337033] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:23.220 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:23.220 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.220 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 
-- # nbd_list=('/dev/nbd0') 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:23.477 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:23.734 [2024-07-12 10:50:58.838100] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17fbe10 00:24:23.734 /dev/nbd0 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:23.734 1+0 records in 00:24:23.734 1+0 records out 00:24:23.734 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263128 s, 15.6 MB/s 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:23.734 10:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:31.842 65536+0 records in 00:24:31.842 65536+0 records out 00:24:31.842 33554432 bytes (34 MB, 32 MiB) copied, 6.79114 s, 4.9 MB/s 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:31.842 10:51:05 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:31.842 [2024-07-12 10:51:05.961162] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:31.842 10:51:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:31.842 [2024-07-12 10:51:06.197847] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:31.842 "name": "raid_bdev1", 00:24:31.842 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:31.842 "strip_size_kb": 0, 00:24:31.842 "state": "online", 00:24:31.842 
"raid_level": "raid1", 00:24:31.842 "superblock": false, 00:24:31.842 "num_base_bdevs": 4, 00:24:31.842 "num_base_bdevs_discovered": 3, 00:24:31.842 "num_base_bdevs_operational": 3, 00:24:31.842 "base_bdevs_list": [ 00:24:31.842 { 00:24:31.842 "name": null, 00:24:31.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.842 "is_configured": false, 00:24:31.842 "data_offset": 0, 00:24:31.842 "data_size": 65536 00:24:31.842 }, 00:24:31.842 { 00:24:31.842 "name": "BaseBdev2", 00:24:31.842 "uuid": "bc5712a4-8019-54d2-b729-a01d2af6649b", 00:24:31.842 "is_configured": true, 00:24:31.842 "data_offset": 0, 00:24:31.842 "data_size": 65536 00:24:31.842 }, 00:24:31.842 { 00:24:31.842 "name": "BaseBdev3", 00:24:31.842 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:31.842 "is_configured": true, 00:24:31.842 "data_offset": 0, 00:24:31.842 "data_size": 65536 00:24:31.842 }, 00:24:31.842 { 00:24:31.842 "name": "BaseBdev4", 00:24:31.842 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:31.842 "is_configured": true, 00:24:31.842 "data_offset": 0, 00:24:31.842 "data_size": 65536 00:24:31.842 } 00:24:31.842 ] 00:24:31.842 }' 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.842 10:51:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:32.101 10:51:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:32.101 [2024-07-12 10:51:07.220555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:32.101 [2024-07-12 10:51:07.224676] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17876b0 00:24:32.101 [2024-07-12 10:51:07.227058] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:32.101 10:51:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.478 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.478 "name": "raid_bdev1", 00:24:33.478 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:33.478 "strip_size_kb": 0, 00:24:33.478 "state": "online", 00:24:33.478 "raid_level": "raid1", 00:24:33.478 "superblock": false, 00:24:33.478 "num_base_bdevs": 4, 00:24:33.478 "num_base_bdevs_discovered": 4, 00:24:33.478 "num_base_bdevs_operational": 4, 00:24:33.478 "process": { 00:24:33.478 "type": "rebuild", 00:24:33.478 "target": "spare", 00:24:33.478 "progress": { 00:24:33.478 "blocks": 24576, 00:24:33.478 "percent": 37 00:24:33.478 } 00:24:33.478 }, 00:24:33.478 
"base_bdevs_list": [ 00:24:33.478 { 00:24:33.478 "name": "spare", 00:24:33.478 "uuid": "bb4f7b61-6041-5f07-a0e3-4fbfd2c6bb3c", 00:24:33.478 "is_configured": true, 00:24:33.478 "data_offset": 0, 00:24:33.478 "data_size": 65536 00:24:33.478 }, 00:24:33.478 { 00:24:33.478 "name": "BaseBdev2", 00:24:33.478 "uuid": "bc5712a4-8019-54d2-b729-a01d2af6649b", 00:24:33.478 "is_configured": true, 00:24:33.478 "data_offset": 0, 00:24:33.478 "data_size": 65536 00:24:33.478 }, 00:24:33.478 { 00:24:33.478 "name": "BaseBdev3", 00:24:33.478 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:33.478 "is_configured": true, 00:24:33.478 "data_offset": 0, 00:24:33.478 "data_size": 65536 00:24:33.478 }, 00:24:33.478 { 00:24:33.478 "name": "BaseBdev4", 00:24:33.478 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:33.478 "is_configured": true, 00:24:33.478 "data_offset": 0, 00:24:33.478 "data_size": 65536 00:24:33.478 } 00:24:33.478 ] 00:24:33.478 }' 00:24:33.479 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.479 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:33.479 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.479 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:33.479 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:33.737 [2024-07-12 10:51:08.814273] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:33.737 [2024-07-12 10:51:08.839797] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:33.737 [2024-07-12 10:51:08.839843] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.738 [2024-07-12 10:51:08.839860] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:33.738 [2024-07-12 10:51:08.839868] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.738 10:51:08 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.997 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.997 "name": "raid_bdev1", 00:24:33.997 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:33.997 "strip_size_kb": 0, 00:24:33.997 "state": "online", 00:24:33.997 "raid_level": "raid1", 00:24:33.997 "superblock": false, 00:24:33.997 "num_base_bdevs": 4, 00:24:33.997 "num_base_bdevs_discovered": 3, 00:24:33.997 "num_base_bdevs_operational": 3, 00:24:33.997 "base_bdevs_list": [ 00:24:33.997 { 00:24:33.997 "name": null, 00:24:33.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.997 "is_configured": false, 00:24:33.997 "data_offset": 0, 00:24:33.997 "data_size": 65536 00:24:33.997 }, 00:24:33.997 { 00:24:33.997 "name": "BaseBdev2", 00:24:33.997 "uuid": "bc5712a4-8019-54d2-b729-a01d2af6649b", 00:24:33.997 "is_configured": true, 00:24:33.997 "data_offset": 0, 00:24:33.997 "data_size": 65536 00:24:33.997 }, 00:24:33.997 { 00:24:33.997 "name": "BaseBdev3", 00:24:33.997 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:33.997 "is_configured": true, 00:24:33.997 "data_offset": 0, 00:24:33.997 "data_size": 65536 00:24:33.997 }, 00:24:33.997 { 00:24:33.997 "name": "BaseBdev4", 00:24:33.997 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:33.997 "is_configured": true, 00:24:33.997 "data_offset": 0, 00:24:33.997 "data_size": 65536 00:24:33.997 } 00:24:33.997 ] 00:24:33.997 }' 00:24:33.997 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.997 10:51:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:34.564 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:34.564 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:34.564 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:34.564 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:34.564 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:34.564 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.564 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.822 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:34.822 "name": "raid_bdev1", 00:24:34.822 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:34.822 "strip_size_kb": 0, 00:24:34.822 "state": "online", 00:24:34.822 "raid_level": "raid1", 00:24:34.822 "superblock": false, 00:24:34.822 "num_base_bdevs": 4, 00:24:34.822 "num_base_bdevs_discovered": 3, 00:24:34.822 "num_base_bdevs_operational": 3, 00:24:34.822 "base_bdevs_list": [ 00:24:34.822 { 00:24:34.822 "name": null, 00:24:34.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.822 "is_configured": false, 00:24:34.822 "data_offset": 0, 00:24:34.822 "data_size": 65536 00:24:34.822 }, 00:24:34.822 { 00:24:34.822 "name": "BaseBdev2", 00:24:34.822 "uuid": "bc5712a4-8019-54d2-b729-a01d2af6649b", 00:24:34.822 "is_configured": true, 00:24:34.822 "data_offset": 0, 00:24:34.822 "data_size": 65536 00:24:34.822 }, 00:24:34.822 { 00:24:34.822 "name": "BaseBdev3", 00:24:34.822 "uuid": 
"376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:34.822 "is_configured": true, 00:24:34.822 "data_offset": 0, 00:24:34.822 "data_size": 65536 00:24:34.822 }, 00:24:34.822 { 00:24:34.822 "name": "BaseBdev4", 00:24:34.822 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:34.822 "is_configured": true, 00:24:34.822 "data_offset": 0, 00:24:34.822 "data_size": 65536 00:24:34.822 } 00:24:34.822 ] 00:24:34.822 }' 00:24:34.822 10:51:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:34.822 10:51:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:34.822 10:51:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:35.081 10:51:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:35.081 10:51:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:35.081 [2024-07-12 10:51:10.220142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:35.081 [2024-07-12 10:51:10.224232] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17876b0 00:24:35.081 [2024-07-12 10:51:10.225741] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:35.081 10:51:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:36.461 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:36.461 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.461 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:36.461 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:36.461 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.461 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:36.462 "name": "raid_bdev1", 00:24:36.462 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:36.462 "strip_size_kb": 0, 00:24:36.462 "state": "online", 00:24:36.462 "raid_level": "raid1", 00:24:36.462 "superblock": false, 00:24:36.462 "num_base_bdevs": 4, 00:24:36.462 "num_base_bdevs_discovered": 4, 00:24:36.462 "num_base_bdevs_operational": 4, 00:24:36.462 "process": { 00:24:36.462 "type": "rebuild", 00:24:36.462 "target": "spare", 00:24:36.462 "progress": { 00:24:36.462 "blocks": 22528, 00:24:36.462 "percent": 34 00:24:36.462 } 00:24:36.462 }, 00:24:36.462 "base_bdevs_list": [ 00:24:36.462 { 00:24:36.462 "name": "spare", 00:24:36.462 "uuid": "bb4f7b61-6041-5f07-a0e3-4fbfd2c6bb3c", 00:24:36.462 "is_configured": true, 00:24:36.462 "data_offset": 0, 00:24:36.462 "data_size": 65536 00:24:36.462 }, 00:24:36.462 { 00:24:36.462 "name": "BaseBdev2", 00:24:36.462 "uuid": "bc5712a4-8019-54d2-b729-a01d2af6649b", 00:24:36.462 "is_configured": true, 00:24:36.462 "data_offset": 0, 00:24:36.462 "data_size": 65536 00:24:36.462 }, 00:24:36.462 { 
00:24:36.462 "name": "BaseBdev3", 00:24:36.462 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:36.462 "is_configured": true, 00:24:36.462 "data_offset": 0, 00:24:36.462 "data_size": 65536 00:24:36.462 }, 00:24:36.462 { 00:24:36.462 "name": "BaseBdev4", 00:24:36.462 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:36.462 "is_configured": true, 00:24:36.462 "data_offset": 0, 00:24:36.462 "data_size": 65536 00:24:36.462 } 00:24:36.462 ] 00:24:36.462 }' 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:36.462 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:36.721 [2024-07-12 10:51:11.757174] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:36.721 [2024-07-12 10:51:11.838353] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x17876b0 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.721 10:51:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.980 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:36.980 "name": "raid_bdev1", 00:24:36.980 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:36.980 "strip_size_kb": 0, 00:24:36.980 "state": "online", 00:24:36.980 "raid_level": "raid1", 00:24:36.980 "superblock": false, 00:24:36.980 "num_base_bdevs": 4, 00:24:36.980 "num_base_bdevs_discovered": 3, 00:24:36.980 "num_base_bdevs_operational": 3, 00:24:36.980 "process": { 00:24:36.980 "type": "rebuild", 00:24:36.980 "target": "spare", 00:24:36.980 "progress": { 00:24:36.980 "blocks": 36864, 00:24:36.980 "percent": 56 00:24:36.980 } 00:24:36.980 }, 00:24:36.980 "base_bdevs_list": [ 00:24:36.980 { 
00:24:36.980 "name": "spare", 00:24:36.980 "uuid": "bb4f7b61-6041-5f07-a0e3-4fbfd2c6bb3c", 00:24:36.980 "is_configured": true, 00:24:36.980 "data_offset": 0, 00:24:36.980 "data_size": 65536 00:24:36.980 }, 00:24:36.980 { 00:24:36.980 "name": null, 00:24:36.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.980 "is_configured": false, 00:24:36.980 "data_offset": 0, 00:24:36.980 "data_size": 65536 00:24:36.980 }, 00:24:36.980 { 00:24:36.980 "name": "BaseBdev3", 00:24:36.980 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:36.980 "is_configured": true, 00:24:36.980 "data_offset": 0, 00:24:36.980 "data_size": 65536 00:24:36.980 }, 00:24:36.980 { 00:24:36.980 "name": "BaseBdev4", 00:24:36.980 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:36.980 "is_configured": true, 00:24:36.980 "data_offset": 0, 00:24:36.980 "data_size": 65536 00:24:36.980 } 00:24:36.980 ] 00:24:36.980 }' 00:24:36.980 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:36.980 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:36.980 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=868 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:37.279 "name": "raid_bdev1", 00:24:37.279 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:37.279 "strip_size_kb": 0, 00:24:37.279 "state": "online", 00:24:37.279 "raid_level": "raid1", 00:24:37.279 "superblock": false, 00:24:37.279 "num_base_bdevs": 4, 00:24:37.279 "num_base_bdevs_discovered": 3, 00:24:37.279 "num_base_bdevs_operational": 3, 00:24:37.279 "process": { 00:24:37.279 "type": "rebuild", 00:24:37.279 "target": "spare", 00:24:37.279 "progress": { 00:24:37.279 "blocks": 43008, 00:24:37.279 "percent": 65 00:24:37.279 } 00:24:37.279 }, 00:24:37.279 "base_bdevs_list": [ 00:24:37.279 { 00:24:37.279 "name": "spare", 00:24:37.279 "uuid": "bb4f7b61-6041-5f07-a0e3-4fbfd2c6bb3c", 00:24:37.279 "is_configured": true, 00:24:37.279 "data_offset": 0, 00:24:37.279 "data_size": 65536 00:24:37.279 }, 00:24:37.279 { 00:24:37.279 "name": null, 00:24:37.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:37.279 "is_configured": false, 00:24:37.279 "data_offset": 0, 00:24:37.279 "data_size": 65536 00:24:37.279 }, 
00:24:37.279 { 00:24:37.279 "name": "BaseBdev3", 00:24:37.279 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:37.279 "is_configured": true, 00:24:37.279 "data_offset": 0, 00:24:37.279 "data_size": 65536 00:24:37.279 }, 00:24:37.279 { 00:24:37.279 "name": "BaseBdev4", 00:24:37.279 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:37.279 "is_configured": true, 00:24:37.279 "data_offset": 0, 00:24:37.279 "data_size": 65536 00:24:37.279 } 00:24:37.279 ] 00:24:37.279 }' 00:24:37.279 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:37.538 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:37.538 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:37.538 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:37.538 10:51:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:38.473 [2024-07-12 10:51:13.450986] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:38.473 [2024-07-12 10:51:13.451046] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:38.473 [2024-07-12 10:51:13.451083] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.473 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.732 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.732 "name": "raid_bdev1", 00:24:38.732 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:38.732 "strip_size_kb": 0, 00:24:38.732 "state": "online", 00:24:38.732 "raid_level": "raid1", 00:24:38.732 "superblock": false, 00:24:38.732 "num_base_bdevs": 4, 00:24:38.732 "num_base_bdevs_discovered": 3, 00:24:38.732 "num_base_bdevs_operational": 3, 00:24:38.732 "base_bdevs_list": [ 00:24:38.732 { 00:24:38.732 "name": "spare", 00:24:38.732 "uuid": "bb4f7b61-6041-5f07-a0e3-4fbfd2c6bb3c", 00:24:38.732 "is_configured": true, 00:24:38.732 "data_offset": 0, 00:24:38.732 "data_size": 65536 00:24:38.732 }, 00:24:38.732 { 00:24:38.732 "name": null, 00:24:38.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.732 "is_configured": false, 00:24:38.732 "data_offset": 0, 00:24:38.732 "data_size": 65536 00:24:38.732 }, 00:24:38.732 { 00:24:38.732 "name": "BaseBdev3", 00:24:38.732 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:38.732 "is_configured": true, 00:24:38.732 "data_offset": 0, 00:24:38.732 "data_size": 65536 00:24:38.732 }, 00:24:38.732 { 00:24:38.732 "name": 
"BaseBdev4", 00:24:38.732 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:38.732 "is_configured": true, 00:24:38.732 "data_offset": 0, 00:24:38.732 "data_size": 65536 00:24:38.732 } 00:24:38.732 ] 00:24:38.732 }' 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.733 10:51:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.991 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.991 "name": "raid_bdev1", 00:24:38.991 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:38.991 "strip_size_kb": 0, 00:24:38.991 "state": "online", 00:24:38.991 "raid_level": "raid1", 00:24:38.991 "superblock": false, 00:24:38.991 "num_base_bdevs": 4, 00:24:38.991 "num_base_bdevs_discovered": 3, 00:24:38.991 "num_base_bdevs_operational": 3, 00:24:38.991 "base_bdevs_list": [ 00:24:38.991 { 00:24:38.991 "name": "spare", 00:24:38.991 "uuid": "bb4f7b61-6041-5f07-a0e3-4fbfd2c6bb3c", 00:24:38.991 "is_configured": true, 00:24:38.991 "data_offset": 0, 00:24:38.991 "data_size": 65536 00:24:38.991 }, 00:24:38.991 { 00:24:38.991 "name": null, 00:24:38.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.991 "is_configured": false, 00:24:38.991 "data_offset": 0, 00:24:38.991 "data_size": 65536 00:24:38.991 }, 00:24:38.991 { 00:24:38.991 "name": "BaseBdev3", 00:24:38.991 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:38.991 "is_configured": true, 00:24:38.991 "data_offset": 0, 00:24:38.991 "data_size": 65536 00:24:38.991 }, 00:24:38.991 { 00:24:38.991 "name": "BaseBdev4", 00:24:38.991 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:38.991 "is_configured": true, 00:24:38.991 "data_offset": 0, 00:24:38.991 "data_size": 65536 00:24:38.991 } 00:24:38.991 ] 00:24:38.991 }' 00:24:38.991 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.992 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:38.992 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:39.250 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:39.250 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 3 00:24:39.250 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.251 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.510 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.510 "name": "raid_bdev1", 00:24:39.510 "uuid": "862d6fa2-33b2-459d-86c2-33ec66d7d136", 00:24:39.510 "strip_size_kb": 0, 00:24:39.510 "state": "online", 00:24:39.510 "raid_level": "raid1", 00:24:39.510 "superblock": false, 00:24:39.510 "num_base_bdevs": 4, 00:24:39.510 "num_base_bdevs_discovered": 3, 00:24:39.510 "num_base_bdevs_operational": 3, 00:24:39.510 "base_bdevs_list": [ 00:24:39.510 { 00:24:39.510 "name": "spare", 00:24:39.510 "uuid": "bb4f7b61-6041-5f07-a0e3-4fbfd2c6bb3c", 00:24:39.510 "is_configured": true, 00:24:39.510 "data_offset": 0, 00:24:39.510 "data_size": 65536 00:24:39.510 }, 00:24:39.510 { 00:24:39.510 "name": null, 00:24:39.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.510 "is_configured": false, 00:24:39.510 "data_offset": 0, 00:24:39.510 "data_size": 65536 00:24:39.510 }, 00:24:39.510 { 00:24:39.510 "name": "BaseBdev3", 00:24:39.510 "uuid": "376743e6-0b74-5ecf-97a5-cf6c1ac8c5d1", 00:24:39.510 "is_configured": true, 00:24:39.510 "data_offset": 0, 00:24:39.510 "data_size": 65536 00:24:39.510 }, 00:24:39.510 { 00:24:39.510 "name": "BaseBdev4", 00:24:39.510 "uuid": "354fd27f-c898-5ec4-bfc1-e1b5af688247", 00:24:39.510 "is_configured": true, 00:24:39.510 "data_offset": 0, 00:24:39.510 "data_size": 65536 00:24:39.510 } 00:24:39.510 ] 00:24:39.510 }' 00:24:39.510 10:51:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.510 10:51:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:40.078 10:51:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:40.337 [2024-07-12 10:51:15.311733] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:40.337 [2024-07-12 10:51:15.311759] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:40.337 [2024-07-12 10:51:15.311813] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:40.337 [2024-07-12 10:51:15.311880] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:24:40.337 [2024-07-12 10:51:15.311891] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17818a0 name raid_bdev1, state offline 00:24:40.337 10:51:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.337 10:51:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:40.596 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:40.855 /dev/nbd0 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:40.855 1+0 records in 00:24:40.855 1+0 records out 00:24:40.855 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241741 s, 16.9 MB/s 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- 
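With raid_bdev1 deleted, the test exports the original BaseBdev1 and the rebuilt spare through the kernel NBD driver and compares them byte for byte (the cmp appears a few lines further down). A condensed sketch of that data check, assuming the nbd module is loaded and using the RPC calls and device names from this run; the polling loop stands in for the waitfornbd helper:

    # Export both bdevs over NBD and verify the rebuild copied the data.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    $rpc nbd_start_disk BaseBdev1 /dev/nbd0
    $rpc nbd_start_disk spare     /dev/nbd1

    # stand-in for waitfornbd: wait until the kernel lists both devices
    for nbd in nbd0 nbd1; do
        until grep -q -w "$nbd" /proc/partitions; do sleep 0.1; done
    done

    # the spare rebuilt into the raid1 array must match the member it replaced
    cmp -i 0 /dev/nbd0 /dev/nbd1

    $rpc nbd_stop_disk /dev/nbd0
    $rpc nbd_stop_disk /dev/nbd1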
common/autotest_common.sh@884 -- # size=4096 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:40.855 10:51:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:41.114 /dev/nbd1 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:41.114 1+0 records in 00:24:41.114 1+0 records out 00:24:41.114 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313282 s, 13.1 MB/s 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local 
nbd_list 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:41.114 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:41.373 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2136506 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2136506 ']' 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2136506 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:41.632 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2136506 00:24:41.891 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:41.891 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:41.891 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2136506' 00:24:41.891 killing process with pid 2136506 00:24:41.891 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2136506 00:24:41.891 Received shutdown signal, test time was about 60.000000 
seconds 00:24:41.891 00:24:41.891 Latency(us) 00:24:41.891 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:41.891 =================================================================================================================== 00:24:41.891 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:41.891 [2024-07-12 10:51:16.850319] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:41.891 10:51:16 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2136506 00:24:41.891 [2024-07-12 10:51:16.900164] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:24:42.150 00:24:42.150 real 0m23.787s 00:24:42.150 user 0m32.504s 00:24:42.150 sys 0m5.105s 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:42.150 ************************************ 00:24:42.150 END TEST raid_rebuild_test 00:24:42.150 ************************************ 00:24:42.150 10:51:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:42.150 10:51:17 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:24:42.150 10:51:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:42.150 10:51:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:42.150 10:51:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:42.150 ************************************ 00:24:42.150 START TEST raid_rebuild_test_sb 00:24:42.150 ************************************ 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:42.150 
10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2140399 00:24:42.150 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2140399 /var/tmp/spdk-raid.sock 00:24:42.151 10:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:42.151 10:51:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2140399 ']' 00:24:42.151 10:51:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:42.151 10:51:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:42.151 10:51:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:42.151 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:42.151 10:51:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:42.151 10:51:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:42.151 [2024-07-12 10:51:17.282257] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:24:42.151 [2024-07-12 10:51:17.282328] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2140399 ] 00:24:42.151 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:42.151 Zero copy mechanism will not be used. 
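The superblock variant drives the same rebuild flow through a fresh bdevperf instance: the app is started with -z so it only initializes and then waits for RPC commands on /var/tmp/spdk-raid.sock, and the test blocks until that socket answers before creating the malloc and passthru base bdevs below. A simplified launch-and-wait sketch built from the command line in the trace; the socket poll replaces the autotest waitforlisten helper, and the cleanup trap is added here purely for illustration:

    sock=/var/tmp/spdk-raid.sock

    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!
    trap 'kill -9 "$raid_pid" 2>/dev/null' EXIT

    # wait until the app has created its RPC socket and answers a request
    until [ -S "$sock" ] && \
          /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s "$sock" \
              rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done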
00:24:42.410 [2024-07-12 10:51:17.413124] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:42.410 [2024-07-12 10:51:17.517817] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:42.410 [2024-07-12 10:51:17.581071] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:42.410 [2024-07-12 10:51:17.581103] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:43.347 10:51:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:43.347 10:51:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:43.347 10:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:43.347 10:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:43.347 BaseBdev1_malloc 00:24:43.347 10:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:43.606 [2024-07-12 10:51:18.625516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:43.606 [2024-07-12 10:51:18.625564] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:43.606 [2024-07-12 10:51:18.625586] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b5bd40 00:24:43.606 [2024-07-12 10:51:18.625599] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:43.606 [2024-07-12 10:51:18.627163] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:43.606 [2024-07-12 10:51:18.627190] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:43.606 BaseBdev1 00:24:43.606 10:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:43.606 10:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:43.864 BaseBdev2_malloc 00:24:43.864 10:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:44.123 [2024-07-12 10:51:19.119818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:44.123 [2024-07-12 10:51:19.119869] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:44.123 [2024-07-12 10:51:19.119895] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b5c860 00:24:44.123 [2024-07-12 10:51:19.119908] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:44.123 [2024-07-12 10:51:19.121374] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:44.123 [2024-07-12 10:51:19.121401] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:44.123 BaseBdev2 00:24:44.123 10:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:44.123 10:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:44.381 BaseBdev3_malloc 00:24:44.381 10:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:44.639 [2024-07-12 10:51:19.613782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:44.639 [2024-07-12 10:51:19.613830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:44.639 [2024-07-12 10:51:19.613853] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d098f0 00:24:44.639 [2024-07-12 10:51:19.613866] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:44.639 [2024-07-12 10:51:19.615291] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:44.639 [2024-07-12 10:51:19.615319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:44.639 BaseBdev3 00:24:44.639 10:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:44.639 10:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:44.897 BaseBdev4_malloc 00:24:44.897 10:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:45.154 [2024-07-12 10:51:20.119786] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:45.154 [2024-07-12 10:51:20.119838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:45.154 [2024-07-12 10:51:20.119860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d08ad0 00:24:45.154 [2024-07-12 10:51:20.119873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:45.154 [2024-07-12 10:51:20.121361] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:45.154 [2024-07-12 10:51:20.121388] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:45.154 BaseBdev4 00:24:45.155 10:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:45.412 spare_malloc 00:24:45.412 10:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:45.671 spare_delay 00:24:45.671 10:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:45.671 [2024-07-12 10:51:20.862312] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:45.671 [2024-07-12 10:51:20.862361] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:45.671 [2024-07-12 10:51:20.862383] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d0d5b0 00:24:45.671 [2024-07-12 10:51:20.862395] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:45.671 [2024-07-12 10:51:20.863922] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:45.671 [2024-07-12 10:51:20.863953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:45.930 spare 00:24:45.930 10:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:45.930 [2024-07-12 10:51:21.111007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:45.930 [2024-07-12 10:51:21.112316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:45.930 [2024-07-12 10:51:21.112372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:45.930 [2024-07-12 10:51:21.112418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:45.930 [2024-07-12 10:51:21.112629] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c8c8a0 00:24:45.930 [2024-07-12 10:51:21.112641] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:45.930 [2024-07-12 10:51:21.112849] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d06e10 00:24:45.930 [2024-07-12 10:51:21.113000] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c8c8a0 00:24:45.930 [2024-07-12 10:51:21.113011] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c8c8a0 00:24:45.930 [2024-07-12 10:51:21.113106] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.188 "name": "raid_bdev1", 00:24:46.188 "uuid": 
"b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:24:46.188 "strip_size_kb": 0, 00:24:46.188 "state": "online", 00:24:46.188 "raid_level": "raid1", 00:24:46.188 "superblock": true, 00:24:46.188 "num_base_bdevs": 4, 00:24:46.188 "num_base_bdevs_discovered": 4, 00:24:46.188 "num_base_bdevs_operational": 4, 00:24:46.188 "base_bdevs_list": [ 00:24:46.188 { 00:24:46.188 "name": "BaseBdev1", 00:24:46.188 "uuid": "f5f45911-49fc-51de-8012-67d68692ad4a", 00:24:46.188 "is_configured": true, 00:24:46.188 "data_offset": 2048, 00:24:46.188 "data_size": 63488 00:24:46.188 }, 00:24:46.188 { 00:24:46.188 "name": "BaseBdev2", 00:24:46.188 "uuid": "67cf78f2-ac4b-50a0-8208-e0aab767805b", 00:24:46.188 "is_configured": true, 00:24:46.188 "data_offset": 2048, 00:24:46.188 "data_size": 63488 00:24:46.188 }, 00:24:46.188 { 00:24:46.188 "name": "BaseBdev3", 00:24:46.188 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:24:46.188 "is_configured": true, 00:24:46.188 "data_offset": 2048, 00:24:46.188 "data_size": 63488 00:24:46.188 }, 00:24:46.188 { 00:24:46.188 "name": "BaseBdev4", 00:24:46.188 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:24:46.188 "is_configured": true, 00:24:46.188 "data_offset": 2048, 00:24:46.188 "data_size": 63488 00:24:46.188 } 00:24:46.188 ] 00:24:46.188 }' 00:24:46.188 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.189 10:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:46.754 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:46.754 10:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:47.012 [2024-07-12 10:51:22.041750] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:47.012 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:47.012 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.012 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:47.270 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:47.270 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:47.270 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:47.270 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:47.270 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:47.270 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:47.271 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:47.271 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:47.271 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:47.271 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:47.271 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:47.271 10:51:22 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:47.271 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:47.271 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:47.529 [2024-07-12 10:51:22.538821] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d06e10 00:24:47.529 /dev/nbd0 00:24:47.529 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:47.530 1+0 records in 00:24:47.530 1+0 records out 00:24:47.530 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277872 s, 14.7 MB/s 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:47.530 10:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:55.637 63488+0 records in 00:24:55.637 63488+0 records out 00:24:55.637 32505856 bytes (33 MB, 31 MiB) copied, 6.8299 s, 4.8 MB/s 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:55.637 [2024-07-12 10:51:29.715809] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:55.637 [2024-07-12 10:51:29.952452] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:55.637 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.638 10:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.638 10:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:55.638 "name": "raid_bdev1", 00:24:55.638 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:24:55.638 "strip_size_kb": 0, 00:24:55.638 "state": "online", 00:24:55.638 "raid_level": "raid1", 00:24:55.638 "superblock": true, 00:24:55.638 
"num_base_bdevs": 4, 00:24:55.638 "num_base_bdevs_discovered": 3, 00:24:55.638 "num_base_bdevs_operational": 3, 00:24:55.638 "base_bdevs_list": [ 00:24:55.638 { 00:24:55.638 "name": null, 00:24:55.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.638 "is_configured": false, 00:24:55.638 "data_offset": 2048, 00:24:55.638 "data_size": 63488 00:24:55.638 }, 00:24:55.638 { 00:24:55.638 "name": "BaseBdev2", 00:24:55.638 "uuid": "67cf78f2-ac4b-50a0-8208-e0aab767805b", 00:24:55.638 "is_configured": true, 00:24:55.638 "data_offset": 2048, 00:24:55.638 "data_size": 63488 00:24:55.638 }, 00:24:55.638 { 00:24:55.638 "name": "BaseBdev3", 00:24:55.638 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:24:55.638 "is_configured": true, 00:24:55.638 "data_offset": 2048, 00:24:55.638 "data_size": 63488 00:24:55.638 }, 00:24:55.638 { 00:24:55.638 "name": "BaseBdev4", 00:24:55.638 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:24:55.638 "is_configured": true, 00:24:55.638 "data_offset": 2048, 00:24:55.638 "data_size": 63488 00:24:55.638 } 00:24:55.638 ] 00:24:55.638 }' 00:24:55.638 10:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:55.638 10:51:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:55.638 10:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:55.895 [2024-07-12 10:51:30.866884] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:55.895 [2024-07-12 10:51:30.871026] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d06e10 00:24:55.895 [2024-07-12 10:51:30.873397] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:55.895 10:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:56.918 10:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:56.918 10:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.918 10:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:56.918 10:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:56.918 10:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.919 10:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.919 10:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.179 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.179 "name": "raid_bdev1", 00:24:57.179 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:24:57.179 "strip_size_kb": 0, 00:24:57.180 "state": "online", 00:24:57.180 "raid_level": "raid1", 00:24:57.180 "superblock": true, 00:24:57.180 "num_base_bdevs": 4, 00:24:57.180 "num_base_bdevs_discovered": 4, 00:24:57.180 "num_base_bdevs_operational": 4, 00:24:57.180 "process": { 00:24:57.180 "type": "rebuild", 00:24:57.180 "target": "spare", 00:24:57.180 "progress": { 00:24:57.180 "blocks": 24576, 00:24:57.180 "percent": 38 00:24:57.180 } 00:24:57.180 }, 00:24:57.180 "base_bdevs_list": [ 
00:24:57.180 { 00:24:57.180 "name": "spare", 00:24:57.180 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:24:57.180 "is_configured": true, 00:24:57.180 "data_offset": 2048, 00:24:57.180 "data_size": 63488 00:24:57.180 }, 00:24:57.180 { 00:24:57.180 "name": "BaseBdev2", 00:24:57.180 "uuid": "67cf78f2-ac4b-50a0-8208-e0aab767805b", 00:24:57.180 "is_configured": true, 00:24:57.180 "data_offset": 2048, 00:24:57.180 "data_size": 63488 00:24:57.180 }, 00:24:57.180 { 00:24:57.180 "name": "BaseBdev3", 00:24:57.180 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:24:57.180 "is_configured": true, 00:24:57.180 "data_offset": 2048, 00:24:57.180 "data_size": 63488 00:24:57.180 }, 00:24:57.180 { 00:24:57.180 "name": "BaseBdev4", 00:24:57.180 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:24:57.180 "is_configured": true, 00:24:57.180 "data_offset": 2048, 00:24:57.180 "data_size": 63488 00:24:57.180 } 00:24:57.180 ] 00:24:57.180 }' 00:24:57.180 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.180 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:57.180 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.180 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:57.180 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:57.437 [2024-07-12 10:51:32.396404] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:57.437 [2024-07-12 10:51:32.486035] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:57.437 [2024-07-12 10:51:32.486088] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:57.437 [2024-07-12 10:51:32.486106] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:57.437 [2024-07-12 10:51:32.486115] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:57.437 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.437 10:51:32 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.695 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:57.695 "name": "raid_bdev1", 00:24:57.695 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:24:57.695 "strip_size_kb": 0, 00:24:57.695 "state": "online", 00:24:57.695 "raid_level": "raid1", 00:24:57.695 "superblock": true, 00:24:57.695 "num_base_bdevs": 4, 00:24:57.695 "num_base_bdevs_discovered": 3, 00:24:57.695 "num_base_bdevs_operational": 3, 00:24:57.695 "base_bdevs_list": [ 00:24:57.695 { 00:24:57.695 "name": null, 00:24:57.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.695 "is_configured": false, 00:24:57.695 "data_offset": 2048, 00:24:57.695 "data_size": 63488 00:24:57.695 }, 00:24:57.695 { 00:24:57.695 "name": "BaseBdev2", 00:24:57.695 "uuid": "67cf78f2-ac4b-50a0-8208-e0aab767805b", 00:24:57.695 "is_configured": true, 00:24:57.695 "data_offset": 2048, 00:24:57.695 "data_size": 63488 00:24:57.695 }, 00:24:57.695 { 00:24:57.695 "name": "BaseBdev3", 00:24:57.695 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:24:57.695 "is_configured": true, 00:24:57.695 "data_offset": 2048, 00:24:57.695 "data_size": 63488 00:24:57.695 }, 00:24:57.695 { 00:24:57.695 "name": "BaseBdev4", 00:24:57.695 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:24:57.695 "is_configured": true, 00:24:57.695 "data_offset": 2048, 00:24:57.695 "data_size": 63488 00:24:57.695 } 00:24:57.695 ] 00:24:57.695 }' 00:24:57.695 10:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:57.695 10:51:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:58.260 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:58.260 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.260 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:58.260 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:58.260 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.260 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.260 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.516 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.516 "name": "raid_bdev1", 00:24:58.516 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:24:58.516 "strip_size_kb": 0, 00:24:58.516 "state": "online", 00:24:58.516 "raid_level": "raid1", 00:24:58.516 "superblock": true, 00:24:58.516 "num_base_bdevs": 4, 00:24:58.516 "num_base_bdevs_discovered": 3, 00:24:58.516 "num_base_bdevs_operational": 3, 00:24:58.516 "base_bdevs_list": [ 00:24:58.516 { 00:24:58.516 "name": null, 00:24:58.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.516 "is_configured": false, 00:24:58.516 "data_offset": 2048, 00:24:58.516 "data_size": 63488 00:24:58.516 }, 00:24:58.516 { 00:24:58.516 "name": "BaseBdev2", 00:24:58.516 "uuid": "67cf78f2-ac4b-50a0-8208-e0aab767805b", 00:24:58.516 "is_configured": true, 00:24:58.516 "data_offset": 2048, 00:24:58.516 "data_size": 63488 00:24:58.516 }, 
00:24:58.516 { 00:24:58.516 "name": "BaseBdev3", 00:24:58.516 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:24:58.516 "is_configured": true, 00:24:58.516 "data_offset": 2048, 00:24:58.516 "data_size": 63488 00:24:58.516 }, 00:24:58.516 { 00:24:58.516 "name": "BaseBdev4", 00:24:58.516 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:24:58.516 "is_configured": true, 00:24:58.516 "data_offset": 2048, 00:24:58.516 "data_size": 63488 00:24:58.516 } 00:24:58.516 ] 00:24:58.516 }' 00:24:58.516 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.516 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:58.516 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.516 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:58.516 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:58.773 [2024-07-12 10:51:33.801959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:58.773 [2024-07-12 10:51:33.806083] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c8ce90 00:24:58.773 [2024-07-12 10:51:33.807593] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:58.773 10:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:59.713 10:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:59.713 10:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.713 10:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:59.713 10:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:59.713 10:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.713 10:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.713 10:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.975 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.975 "name": "raid_bdev1", 00:24:59.975 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:24:59.975 "strip_size_kb": 0, 00:24:59.975 "state": "online", 00:24:59.975 "raid_level": "raid1", 00:24:59.975 "superblock": true, 00:24:59.975 "num_base_bdevs": 4, 00:24:59.975 "num_base_bdevs_discovered": 4, 00:24:59.975 "num_base_bdevs_operational": 4, 00:24:59.975 "process": { 00:24:59.975 "type": "rebuild", 00:24:59.975 "target": "spare", 00:24:59.975 "progress": { 00:24:59.975 "blocks": 22528, 00:24:59.975 "percent": 35 00:24:59.975 } 00:24:59.975 }, 00:24:59.975 "base_bdevs_list": [ 00:24:59.975 { 00:24:59.975 "name": "spare", 00:24:59.975 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:24:59.975 "is_configured": true, 00:24:59.975 "data_offset": 2048, 00:24:59.975 "data_size": 63488 00:24:59.975 }, 00:24:59.975 { 00:24:59.975 "name": "BaseBdev2", 00:24:59.975 "uuid": "67cf78f2-ac4b-50a0-8208-e0aab767805b", 00:24:59.975 
"is_configured": true, 00:24:59.975 "data_offset": 2048, 00:24:59.975 "data_size": 63488 00:24:59.975 }, 00:24:59.975 { 00:24:59.975 "name": "BaseBdev3", 00:24:59.975 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:24:59.975 "is_configured": true, 00:24:59.975 "data_offset": 2048, 00:24:59.975 "data_size": 63488 00:24:59.975 }, 00:24:59.975 { 00:24:59.975 "name": "BaseBdev4", 00:24:59.975 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:24:59.975 "is_configured": true, 00:24:59.975 "data_offset": 2048, 00:24:59.975 "data_size": 63488 00:24:59.975 } 00:24:59.975 ] 00:24:59.975 }' 00:24:59.975 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.975 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:59.975 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.975 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:59.975 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:59.976 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:59.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:59.976 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:59.976 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:59.976 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:59.976 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:00.233 [2024-07-12 10:51:35.335103] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:00.490 [2024-07-12 10:51:35.520660] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1c8ce90 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.490 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.747 "name": "raid_bdev1", 00:25:00.747 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:00.747 "strip_size_kb": 0, 00:25:00.747 "state": "online", 00:25:00.747 "raid_level": "raid1", 
00:25:00.747 "superblock": true, 00:25:00.747 "num_base_bdevs": 4, 00:25:00.747 "num_base_bdevs_discovered": 3, 00:25:00.747 "num_base_bdevs_operational": 3, 00:25:00.747 "process": { 00:25:00.747 "type": "rebuild", 00:25:00.747 "target": "spare", 00:25:00.747 "progress": { 00:25:00.747 "blocks": 36864, 00:25:00.747 "percent": 58 00:25:00.747 } 00:25:00.747 }, 00:25:00.747 "base_bdevs_list": [ 00:25:00.747 { 00:25:00.747 "name": "spare", 00:25:00.747 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:00.747 "is_configured": true, 00:25:00.747 "data_offset": 2048, 00:25:00.747 "data_size": 63488 00:25:00.747 }, 00:25:00.747 { 00:25:00.747 "name": null, 00:25:00.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:00.747 "is_configured": false, 00:25:00.747 "data_offset": 2048, 00:25:00.747 "data_size": 63488 00:25:00.747 }, 00:25:00.747 { 00:25:00.747 "name": "BaseBdev3", 00:25:00.747 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:00.747 "is_configured": true, 00:25:00.747 "data_offset": 2048, 00:25:00.747 "data_size": 63488 00:25:00.747 }, 00:25:00.747 { 00:25:00.747 "name": "BaseBdev4", 00:25:00.747 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:00.747 "is_configured": true, 00:25:00.747 "data_offset": 2048, 00:25:00.747 "data_size": 63488 00:25:00.747 } 00:25:00.747 ] 00:25:00.747 }' 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=891 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.747 10:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.004 10:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.004 "name": "raid_bdev1", 00:25:01.004 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:01.004 "strip_size_kb": 0, 00:25:01.004 "state": "online", 00:25:01.004 "raid_level": "raid1", 00:25:01.004 "superblock": true, 00:25:01.004 "num_base_bdevs": 4, 00:25:01.004 "num_base_bdevs_discovered": 3, 00:25:01.004 "num_base_bdevs_operational": 3, 00:25:01.004 "process": { 00:25:01.004 "type": "rebuild", 00:25:01.004 "target": "spare", 00:25:01.004 "progress": { 00:25:01.004 "blocks": 45056, 00:25:01.004 "percent": 70 00:25:01.004 } 00:25:01.004 }, 00:25:01.004 
"base_bdevs_list": [ 00:25:01.004 { 00:25:01.004 "name": "spare", 00:25:01.004 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:01.004 "is_configured": true, 00:25:01.005 "data_offset": 2048, 00:25:01.005 "data_size": 63488 00:25:01.005 }, 00:25:01.005 { 00:25:01.005 "name": null, 00:25:01.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.005 "is_configured": false, 00:25:01.005 "data_offset": 2048, 00:25:01.005 "data_size": 63488 00:25:01.005 }, 00:25:01.005 { 00:25:01.005 "name": "BaseBdev3", 00:25:01.005 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:01.005 "is_configured": true, 00:25:01.005 "data_offset": 2048, 00:25:01.005 "data_size": 63488 00:25:01.005 }, 00:25:01.005 { 00:25:01.005 "name": "BaseBdev4", 00:25:01.005 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:01.005 "is_configured": true, 00:25:01.005 "data_offset": 2048, 00:25:01.005 "data_size": 63488 00:25:01.005 } 00:25:01.005 ] 00:25:01.005 }' 00:25:01.005 10:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.005 10:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.005 10:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.261 10:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.261 10:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:01.857 [2024-07-12 10:51:37.032528] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:01.857 [2024-07-12 10:51:37.032595] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:01.857 [2024-07-12 10:51:37.032689] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.114 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:02.370 "name": "raid_bdev1", 00:25:02.370 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:02.370 "strip_size_kb": 0, 00:25:02.370 "state": "online", 00:25:02.370 "raid_level": "raid1", 00:25:02.370 "superblock": true, 00:25:02.370 "num_base_bdevs": 4, 00:25:02.370 "num_base_bdevs_discovered": 3, 00:25:02.370 "num_base_bdevs_operational": 3, 00:25:02.370 "base_bdevs_list": [ 00:25:02.370 { 00:25:02.370 "name": "spare", 00:25:02.370 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:02.370 "is_configured": true, 00:25:02.370 "data_offset": 2048, 
00:25:02.370 "data_size": 63488 00:25:02.370 }, 00:25:02.370 { 00:25:02.370 "name": null, 00:25:02.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.370 "is_configured": false, 00:25:02.370 "data_offset": 2048, 00:25:02.370 "data_size": 63488 00:25:02.370 }, 00:25:02.370 { 00:25:02.370 "name": "BaseBdev3", 00:25:02.370 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:02.370 "is_configured": true, 00:25:02.370 "data_offset": 2048, 00:25:02.370 "data_size": 63488 00:25:02.370 }, 00:25:02.370 { 00:25:02.370 "name": "BaseBdev4", 00:25:02.370 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:02.370 "is_configured": true, 00:25:02.370 "data_offset": 2048, 00:25:02.370 "data_size": 63488 00:25:02.370 } 00:25:02.370 ] 00:25:02.370 }' 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.370 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.627 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:02.627 "name": "raid_bdev1", 00:25:02.627 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:02.627 "strip_size_kb": 0, 00:25:02.627 "state": "online", 00:25:02.627 "raid_level": "raid1", 00:25:02.627 "superblock": true, 00:25:02.627 "num_base_bdevs": 4, 00:25:02.627 "num_base_bdevs_discovered": 3, 00:25:02.627 "num_base_bdevs_operational": 3, 00:25:02.627 "base_bdevs_list": [ 00:25:02.627 { 00:25:02.627 "name": "spare", 00:25:02.627 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:02.627 "is_configured": true, 00:25:02.627 "data_offset": 2048, 00:25:02.627 "data_size": 63488 00:25:02.627 }, 00:25:02.627 { 00:25:02.627 "name": null, 00:25:02.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.627 "is_configured": false, 00:25:02.627 "data_offset": 2048, 00:25:02.627 "data_size": 63488 00:25:02.627 }, 00:25:02.627 { 00:25:02.627 "name": "BaseBdev3", 00:25:02.627 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:02.627 "is_configured": true, 00:25:02.627 "data_offset": 2048, 00:25:02.627 "data_size": 63488 00:25:02.627 }, 00:25:02.627 { 00:25:02.627 "name": "BaseBdev4", 00:25:02.627 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:02.627 "is_configured": true, 00:25:02.627 "data_offset": 2048, 00:25:02.627 "data_size": 63488 
00:25:02.627 } 00:25:02.627 ] 00:25:02.627 }' 00:25:02.627 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.885 10:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.143 10:51:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:03.143 "name": "raid_bdev1", 00:25:03.143 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:03.143 "strip_size_kb": 0, 00:25:03.143 "state": "online", 00:25:03.143 "raid_level": "raid1", 00:25:03.143 "superblock": true, 00:25:03.143 "num_base_bdevs": 4, 00:25:03.143 "num_base_bdevs_discovered": 3, 00:25:03.143 "num_base_bdevs_operational": 3, 00:25:03.143 "base_bdevs_list": [ 00:25:03.143 { 00:25:03.143 "name": "spare", 00:25:03.143 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:03.143 "is_configured": true, 00:25:03.143 "data_offset": 2048, 00:25:03.143 "data_size": 63488 00:25:03.143 }, 00:25:03.143 { 00:25:03.143 "name": null, 00:25:03.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:03.143 "is_configured": false, 00:25:03.143 "data_offset": 2048, 00:25:03.143 "data_size": 63488 00:25:03.143 }, 00:25:03.143 { 00:25:03.143 "name": "BaseBdev3", 00:25:03.143 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:03.143 "is_configured": true, 00:25:03.143 "data_offset": 2048, 00:25:03.143 "data_size": 63488 00:25:03.143 }, 00:25:03.143 { 00:25:03.143 "name": "BaseBdev4", 00:25:03.143 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:03.143 "is_configured": true, 00:25:03.143 "data_offset": 2048, 00:25:03.143 "data_size": 63488 00:25:03.143 } 00:25:03.143 ] 00:25:03.143 }' 00:25:03.143 10:51:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:03.143 10:51:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:03.709 10:51:38 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:03.709 [2024-07-12 10:51:38.901793] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:03.709 [2024-07-12 10:51:38.901825] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:03.709 [2024-07-12 10:51:38.901886] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:03.709 [2024-07-12 10:51:38.901954] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:03.709 [2024-07-12 10:51:38.901966] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c8c8a0 name raid_bdev1, state offline 00:25:03.968 10:51:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.968 10:51:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:04.227 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:04.227 /dev/nbd0 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:04.486 10:51:39 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:04.486 1+0 records in 00:25:04.486 1+0 records out 00:25:04.486 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238073 s, 17.2 MB/s 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:04.486 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:04.745 /dev/nbd1 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:04.745 1+0 records in 00:25:04.745 1+0 records out 00:25:04.745 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334779 s, 12.2 MB/s 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:04.745 10:51:39 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:04.745 10:51:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:05.004 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:05.263 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:05.521 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:05.780 [2024-07-12 10:51:40.752837] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:05.780 [2024-07-12 10:51:40.752890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:05.780 [2024-07-12 10:51:40.752913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d06b40 00:25:05.780 [2024-07-12 10:51:40.752926] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:05.780 [2024-07-12 10:51:40.754586] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:05.780 [2024-07-12 10:51:40.754617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:05.780 [2024-07-12 10:51:40.754704] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:05.780 [2024-07-12 10:51:40.754731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.780 [2024-07-12 10:51:40.754838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:05.780 [2024-07-12 10:51:40.754911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:05.780 spare 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.780 10:51:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.780 [2024-07-12 10:51:40.855228] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c90ba0 00:25:05.780 [2024-07-12 10:51:40.855248] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:05.780 [2024-07-12 10:51:40.855459] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d06550 00:25:05.780 [2024-07-12 10:51:40.855621] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c90ba0 00:25:05.780 [2024-07-12 10:51:40.855638] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c90ba0 00:25:05.780 [2024-07-12 10:51:40.855743] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:25:06.040 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.040 "name": "raid_bdev1", 00:25:06.040 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:06.040 "strip_size_kb": 0, 00:25:06.040 "state": "online", 00:25:06.040 "raid_level": "raid1", 00:25:06.040 "superblock": true, 00:25:06.040 "num_base_bdevs": 4, 00:25:06.040 "num_base_bdevs_discovered": 3, 00:25:06.040 "num_base_bdevs_operational": 3, 00:25:06.040 "base_bdevs_list": [ 00:25:06.040 { 00:25:06.040 "name": "spare", 00:25:06.040 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:06.040 "is_configured": true, 00:25:06.040 "data_offset": 2048, 00:25:06.040 "data_size": 63488 00:25:06.040 }, 00:25:06.040 { 00:25:06.040 "name": null, 00:25:06.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.040 "is_configured": false, 00:25:06.040 "data_offset": 2048, 00:25:06.040 "data_size": 63488 00:25:06.040 }, 00:25:06.040 { 00:25:06.040 "name": "BaseBdev3", 00:25:06.040 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:06.040 "is_configured": true, 00:25:06.040 "data_offset": 2048, 00:25:06.040 "data_size": 63488 00:25:06.040 }, 00:25:06.040 { 00:25:06.040 "name": "BaseBdev4", 00:25:06.040 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:06.040 "is_configured": true, 00:25:06.040 "data_offset": 2048, 00:25:06.040 "data_size": 63488 00:25:06.040 } 00:25:06.040 ] 00:25:06.040 }' 00:25:06.040 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.040 10:51:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:06.608 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:06.608 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.608 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:06.608 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:06.608 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.608 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.608 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.867 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.867 "name": "raid_bdev1", 00:25:06.867 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:06.867 "strip_size_kb": 0, 00:25:06.867 "state": "online", 00:25:06.867 "raid_level": "raid1", 00:25:06.867 "superblock": true, 00:25:06.867 "num_base_bdevs": 4, 00:25:06.867 "num_base_bdevs_discovered": 3, 00:25:06.867 "num_base_bdevs_operational": 3, 00:25:06.867 "base_bdevs_list": [ 00:25:06.867 { 00:25:06.867 "name": "spare", 00:25:06.867 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:06.867 "is_configured": true, 00:25:06.867 "data_offset": 2048, 00:25:06.867 "data_size": 63488 00:25:06.867 }, 00:25:06.867 { 00:25:06.867 "name": null, 00:25:06.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.867 "is_configured": false, 00:25:06.867 "data_offset": 2048, 00:25:06.867 "data_size": 63488 00:25:06.867 }, 00:25:06.867 { 00:25:06.867 "name": "BaseBdev3", 00:25:06.867 "uuid": 
"206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:06.867 "is_configured": true, 00:25:06.867 "data_offset": 2048, 00:25:06.867 "data_size": 63488 00:25:06.867 }, 00:25:06.867 { 00:25:06.867 "name": "BaseBdev4", 00:25:06.867 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:06.867 "is_configured": true, 00:25:06.867 "data_offset": 2048, 00:25:06.867 "data_size": 63488 00:25:06.867 } 00:25:06.867 ] 00:25:06.867 }' 00:25:06.867 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.867 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:06.867 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.867 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:06.867 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.867 10:51:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:07.126 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:07.126 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:07.385 [2024-07-12 10:51:42.445451] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.385 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.644 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.644 "name": "raid_bdev1", 00:25:07.644 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:07.644 "strip_size_kb": 0, 00:25:07.644 "state": "online", 00:25:07.644 "raid_level": "raid1", 00:25:07.644 "superblock": true, 00:25:07.644 "num_base_bdevs": 4, 00:25:07.644 "num_base_bdevs_discovered": 2, 00:25:07.644 "num_base_bdevs_operational": 2, 00:25:07.644 "base_bdevs_list": [ 00:25:07.644 { 00:25:07.644 "name": null, 
00:25:07.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.644 "is_configured": false, 00:25:07.644 "data_offset": 2048, 00:25:07.644 "data_size": 63488 00:25:07.644 }, 00:25:07.644 { 00:25:07.644 "name": null, 00:25:07.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.644 "is_configured": false, 00:25:07.644 "data_offset": 2048, 00:25:07.644 "data_size": 63488 00:25:07.644 }, 00:25:07.644 { 00:25:07.644 "name": "BaseBdev3", 00:25:07.644 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:07.644 "is_configured": true, 00:25:07.644 "data_offset": 2048, 00:25:07.644 "data_size": 63488 00:25:07.644 }, 00:25:07.644 { 00:25:07.644 "name": "BaseBdev4", 00:25:07.644 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:07.644 "is_configured": true, 00:25:07.644 "data_offset": 2048, 00:25:07.644 "data_size": 63488 00:25:07.644 } 00:25:07.644 ] 00:25:07.644 }' 00:25:07.644 10:51:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.644 10:51:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:08.210 10:51:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.468 [2024-07-12 10:51:43.528340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.468 [2024-07-12 10:51:43.528501] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:08.468 [2024-07-12 10:51:43.528520] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:08.468 [2024-07-12 10:51:43.528548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.468 [2024-07-12 10:51:43.532530] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c90740 00:25:08.468 [2024-07-12 10:51:43.534901] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.468 10:51:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:09.401 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.401 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.401 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.401 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.401 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.401 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.401 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.659 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.659 "name": "raid_bdev1", 00:25:09.659 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:09.659 "strip_size_kb": 0, 00:25:09.659 "state": "online", 00:25:09.659 "raid_level": "raid1", 00:25:09.660 "superblock": true, 00:25:09.660 "num_base_bdevs": 4, 00:25:09.660 "num_base_bdevs_discovered": 3, 00:25:09.660 "num_base_bdevs_operational": 3, 
00:25:09.660 "process": { 00:25:09.660 "type": "rebuild", 00:25:09.660 "target": "spare", 00:25:09.660 "progress": { 00:25:09.660 "blocks": 24576, 00:25:09.660 "percent": 38 00:25:09.660 } 00:25:09.660 }, 00:25:09.660 "base_bdevs_list": [ 00:25:09.660 { 00:25:09.660 "name": "spare", 00:25:09.660 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:09.660 "is_configured": true, 00:25:09.660 "data_offset": 2048, 00:25:09.660 "data_size": 63488 00:25:09.660 }, 00:25:09.660 { 00:25:09.660 "name": null, 00:25:09.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.660 "is_configured": false, 00:25:09.660 "data_offset": 2048, 00:25:09.660 "data_size": 63488 00:25:09.660 }, 00:25:09.660 { 00:25:09.660 "name": "BaseBdev3", 00:25:09.660 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:09.660 "is_configured": true, 00:25:09.660 "data_offset": 2048, 00:25:09.660 "data_size": 63488 00:25:09.660 }, 00:25:09.660 { 00:25:09.660 "name": "BaseBdev4", 00:25:09.660 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:09.660 "is_configured": true, 00:25:09.660 "data_offset": 2048, 00:25:09.660 "data_size": 63488 00:25:09.660 } 00:25:09.660 ] 00:25:09.660 }' 00:25:09.660 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.660 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.660 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.918 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:09.918 10:51:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:09.918 [2024-07-12 10:51:45.109747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.176 [2024-07-12 10:51:45.147191] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:10.176 [2024-07-12 10:51:45.147233] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:10.176 [2024-07-12 10:51:45.147250] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.176 [2024-07-12 10:51:45.147259] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.176 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.435 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.435 "name": "raid_bdev1", 00:25:10.435 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:10.435 "strip_size_kb": 0, 00:25:10.435 "state": "online", 00:25:10.435 "raid_level": "raid1", 00:25:10.435 "superblock": true, 00:25:10.435 "num_base_bdevs": 4, 00:25:10.435 "num_base_bdevs_discovered": 2, 00:25:10.435 "num_base_bdevs_operational": 2, 00:25:10.435 "base_bdevs_list": [ 00:25:10.435 { 00:25:10.435 "name": null, 00:25:10.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.435 "is_configured": false, 00:25:10.435 "data_offset": 2048, 00:25:10.435 "data_size": 63488 00:25:10.435 }, 00:25:10.435 { 00:25:10.435 "name": null, 00:25:10.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.435 "is_configured": false, 00:25:10.435 "data_offset": 2048, 00:25:10.435 "data_size": 63488 00:25:10.435 }, 00:25:10.435 { 00:25:10.435 "name": "BaseBdev3", 00:25:10.435 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:10.435 "is_configured": true, 00:25:10.435 "data_offset": 2048, 00:25:10.435 "data_size": 63488 00:25:10.435 }, 00:25:10.435 { 00:25:10.435 "name": "BaseBdev4", 00:25:10.435 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:10.435 "is_configured": true, 00:25:10.435 "data_offset": 2048, 00:25:10.435 "data_size": 63488 00:25:10.435 } 00:25:10.435 ] 00:25:10.435 }' 00:25:10.435 10:51:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.435 10:51:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:11.001 10:51:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:11.001 [2024-07-12 10:51:46.182549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:11.001 [2024-07-12 10:51:46.182604] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:11.001 [2024-07-12 10:51:46.182628] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c8d650 00:25:11.001 [2024-07-12 10:51:46.182641] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:11.001 [2024-07-12 10:51:46.183022] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:11.001 [2024-07-12 10:51:46.183040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:11.001 [2024-07-12 10:51:46.183121] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:11.001 [2024-07-12 10:51:46.183132] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:11.001 [2024-07-12 10:51:46.183143] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
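[Editor's note] The churn around "spare" traced above is driven by two RPCs that appear verbatim in this log: the passthru bdev is deleted out of raid_bdev1 and then re-created on top of spare_delay, after which the examine path ("Re-adding bdev spare to raid bdev raid_bdev1") pulls it back in and a rebuild starts. A minimal bash sketch of that step, assuming the same RPC socket and script path printed in the trace; the rpc_py variable is illustrative and not taken from bdev_raid.sh:

  # Drop the spare out of the raid1 set, then bring the same passthru back;
  # raid examine re-adds it to raid_bdev1 and kicks off a rebuild, as logged above.
  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc_py bdev_passthru_delete spare
  $rpc_py bdev_passthru_create -b spare_delay -p spare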
00:25:11.001 [2024-07-12 10:51:46.183162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:11.001 [2024-07-12 10:51:46.187164] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c90580 00:25:11.001 spare 00:25:11.001 [2024-07-12 10:51:46.188562] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:11.278 10:51:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:12.231 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.231 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.231 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:12.231 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.231 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.231 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.231 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.489 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.489 "name": "raid_bdev1", 00:25:12.489 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:12.489 "strip_size_kb": 0, 00:25:12.489 "state": "online", 00:25:12.489 "raid_level": "raid1", 00:25:12.489 "superblock": true, 00:25:12.489 "num_base_bdevs": 4, 00:25:12.489 "num_base_bdevs_discovered": 3, 00:25:12.489 "num_base_bdevs_operational": 3, 00:25:12.489 "process": { 00:25:12.489 "type": "rebuild", 00:25:12.489 "target": "spare", 00:25:12.489 "progress": { 00:25:12.489 "blocks": 24576, 00:25:12.489 "percent": 38 00:25:12.489 } 00:25:12.489 }, 00:25:12.489 "base_bdevs_list": [ 00:25:12.489 { 00:25:12.489 "name": "spare", 00:25:12.489 "uuid": "01110f45-ff00-5d1f-b98d-e3a62facf1eb", 00:25:12.489 "is_configured": true, 00:25:12.489 "data_offset": 2048, 00:25:12.489 "data_size": 63488 00:25:12.489 }, 00:25:12.489 { 00:25:12.489 "name": null, 00:25:12.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.489 "is_configured": false, 00:25:12.489 "data_offset": 2048, 00:25:12.489 "data_size": 63488 00:25:12.489 }, 00:25:12.489 { 00:25:12.489 "name": "BaseBdev3", 00:25:12.489 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:12.489 "is_configured": true, 00:25:12.489 "data_offset": 2048, 00:25:12.489 "data_size": 63488 00:25:12.489 }, 00:25:12.489 { 00:25:12.489 "name": "BaseBdev4", 00:25:12.489 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:12.489 "is_configured": true, 00:25:12.489 "data_offset": 2048, 00:25:12.489 "data_size": 63488 00:25:12.489 } 00:25:12.489 ] 00:25:12.489 }' 00:25:12.489 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.489 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.489 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.489 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:12.489 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:12.747 [2024-07-12 10:51:47.704979] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:12.747 [2024-07-12 10:51:47.801207] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:12.747 [2024-07-12 10:51:47.801251] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:12.747 [2024-07-12 10:51:47.801267] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:12.747 [2024-07-12 10:51:47.801275] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.747 10:51:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.008 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:13.008 "name": "raid_bdev1", 00:25:13.008 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:13.008 "strip_size_kb": 0, 00:25:13.008 "state": "online", 00:25:13.008 "raid_level": "raid1", 00:25:13.008 "superblock": true, 00:25:13.008 "num_base_bdevs": 4, 00:25:13.008 "num_base_bdevs_discovered": 2, 00:25:13.008 "num_base_bdevs_operational": 2, 00:25:13.008 "base_bdevs_list": [ 00:25:13.008 { 00:25:13.008 "name": null, 00:25:13.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.008 "is_configured": false, 00:25:13.008 "data_offset": 2048, 00:25:13.008 "data_size": 63488 00:25:13.008 }, 00:25:13.008 { 00:25:13.008 "name": null, 00:25:13.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.008 "is_configured": false, 00:25:13.008 "data_offset": 2048, 00:25:13.008 "data_size": 63488 00:25:13.008 }, 00:25:13.008 { 00:25:13.008 "name": "BaseBdev3", 00:25:13.008 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:13.008 "is_configured": true, 00:25:13.008 "data_offset": 2048, 00:25:13.008 "data_size": 63488 00:25:13.008 }, 00:25:13.008 { 00:25:13.008 "name": "BaseBdev4", 00:25:13.008 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:13.008 "is_configured": true, 00:25:13.008 "data_offset": 2048, 00:25:13.008 "data_size": 63488 
00:25:13.008 } 00:25:13.008 ] 00:25:13.008 }' 00:25:13.008 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:13.008 10:51:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:13.572 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.572 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.572 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.572 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.572 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.572 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.572 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.829 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.829 "name": "raid_bdev1", 00:25:13.829 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:13.829 "strip_size_kb": 0, 00:25:13.829 "state": "online", 00:25:13.829 "raid_level": "raid1", 00:25:13.829 "superblock": true, 00:25:13.829 "num_base_bdevs": 4, 00:25:13.829 "num_base_bdevs_discovered": 2, 00:25:13.829 "num_base_bdevs_operational": 2, 00:25:13.829 "base_bdevs_list": [ 00:25:13.829 { 00:25:13.829 "name": null, 00:25:13.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.829 "is_configured": false, 00:25:13.829 "data_offset": 2048, 00:25:13.830 "data_size": 63488 00:25:13.830 }, 00:25:13.830 { 00:25:13.830 "name": null, 00:25:13.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.830 "is_configured": false, 00:25:13.830 "data_offset": 2048, 00:25:13.830 "data_size": 63488 00:25:13.830 }, 00:25:13.830 { 00:25:13.830 "name": "BaseBdev3", 00:25:13.830 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:13.830 "is_configured": true, 00:25:13.830 "data_offset": 2048, 00:25:13.830 "data_size": 63488 00:25:13.830 }, 00:25:13.830 { 00:25:13.830 "name": "BaseBdev4", 00:25:13.830 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:13.830 "is_configured": true, 00:25:13.830 "data_offset": 2048, 00:25:13.830 "data_size": 63488 00:25:13.830 } 00:25:13.830 ] 00:25:13.830 }' 00:25:13.830 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.830 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.830 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.830 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.830 10:51:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:14.087 10:51:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:14.345 [2024-07-12 10:51:49.421561] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:14.345 
[2024-07-12 10:51:49.421607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:14.345 [2024-07-12 10:51:49.421630] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c91010 00:25:14.345 [2024-07-12 10:51:49.421642] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:14.345 [2024-07-12 10:51:49.421981] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:14.345 [2024-07-12 10:51:49.421997] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:14.345 [2024-07-12 10:51:49.422059] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:14.345 [2024-07-12 10:51:49.422071] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:14.345 [2024-07-12 10:51:49.422088] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:14.345 BaseBdev1 00:25:14.345 10:51:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:15.277 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.278 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.535 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.535 "name": "raid_bdev1", 00:25:15.535 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:15.535 "strip_size_kb": 0, 00:25:15.535 "state": "online", 00:25:15.535 "raid_level": "raid1", 00:25:15.535 "superblock": true, 00:25:15.535 "num_base_bdevs": 4, 00:25:15.535 "num_base_bdevs_discovered": 2, 00:25:15.536 "num_base_bdevs_operational": 2, 00:25:15.536 "base_bdevs_list": [ 00:25:15.536 { 00:25:15.536 "name": null, 00:25:15.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.536 "is_configured": false, 00:25:15.536 "data_offset": 2048, 00:25:15.536 "data_size": 63488 00:25:15.536 }, 00:25:15.536 { 00:25:15.536 "name": null, 00:25:15.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.536 "is_configured": false, 00:25:15.536 "data_offset": 2048, 00:25:15.536 "data_size": 63488 00:25:15.536 }, 00:25:15.536 { 
00:25:15.536 "name": "BaseBdev3", 00:25:15.536 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:15.536 "is_configured": true, 00:25:15.536 "data_offset": 2048, 00:25:15.536 "data_size": 63488 00:25:15.536 }, 00:25:15.536 { 00:25:15.536 "name": "BaseBdev4", 00:25:15.536 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:15.536 "is_configured": true, 00:25:15.536 "data_offset": 2048, 00:25:15.536 "data_size": 63488 00:25:15.536 } 00:25:15.536 ] 00:25:15.536 }' 00:25:15.536 10:51:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.536 10:51:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.469 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:16.469 "name": "raid_bdev1", 00:25:16.469 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:16.469 "strip_size_kb": 0, 00:25:16.469 "state": "online", 00:25:16.469 "raid_level": "raid1", 00:25:16.469 "superblock": true, 00:25:16.469 "num_base_bdevs": 4, 00:25:16.469 "num_base_bdevs_discovered": 2, 00:25:16.469 "num_base_bdevs_operational": 2, 00:25:16.469 "base_bdevs_list": [ 00:25:16.469 { 00:25:16.469 "name": null, 00:25:16.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.469 "is_configured": false, 00:25:16.469 "data_offset": 2048, 00:25:16.469 "data_size": 63488 00:25:16.469 }, 00:25:16.469 { 00:25:16.469 "name": null, 00:25:16.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.469 "is_configured": false, 00:25:16.469 "data_offset": 2048, 00:25:16.469 "data_size": 63488 00:25:16.469 }, 00:25:16.469 { 00:25:16.469 "name": "BaseBdev3", 00:25:16.469 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:16.469 "is_configured": true, 00:25:16.469 "data_offset": 2048, 00:25:16.469 "data_size": 63488 00:25:16.469 }, 00:25:16.470 { 00:25:16.470 "name": "BaseBdev4", 00:25:16.470 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:16.470 "is_configured": true, 00:25:16.470 "data_offset": 2048, 00:25:16.470 "data_size": 63488 00:25:16.470 } 00:25:16.470 ] 00:25:16.470 }' 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:16.470 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.728 [2024-07-12 10:51:51.872183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:16.728 [2024-07-12 10:51:51.872316] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:16.728 [2024-07-12 10:51:51.872332] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:16.728 request: 00:25:16.728 { 00:25:16.728 "base_bdev": "BaseBdev1", 00:25:16.728 "raid_bdev": "raid_bdev1", 00:25:16.728 "method": "bdev_raid_add_base_bdev", 00:25:16.728 "req_id": 1 00:25:16.728 } 00:25:16.728 Got JSON-RPC error response 00:25:16.728 response: 00:25:16.728 { 00:25:16.728 "code": -22, 00:25:16.728 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:16.728 } 00:25:16.728 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:16.728 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:16.728 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:16.728 10:51:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:16.728 10:51:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.101 10:51:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.101 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:18.101 "name": "raid_bdev1", 00:25:18.101 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:18.101 "strip_size_kb": 0, 00:25:18.101 "state": "online", 00:25:18.101 "raid_level": "raid1", 00:25:18.101 "superblock": true, 00:25:18.101 "num_base_bdevs": 4, 00:25:18.101 "num_base_bdevs_discovered": 2, 00:25:18.101 "num_base_bdevs_operational": 2, 00:25:18.101 "base_bdevs_list": [ 00:25:18.101 { 00:25:18.101 "name": null, 00:25:18.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.101 "is_configured": false, 00:25:18.101 "data_offset": 2048, 00:25:18.101 "data_size": 63488 00:25:18.101 }, 00:25:18.101 { 00:25:18.101 "name": null, 00:25:18.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.101 "is_configured": false, 00:25:18.101 "data_offset": 2048, 00:25:18.101 "data_size": 63488 00:25:18.101 }, 00:25:18.101 { 00:25:18.101 "name": "BaseBdev3", 00:25:18.101 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:18.101 "is_configured": true, 00:25:18.101 "data_offset": 2048, 00:25:18.101 "data_size": 63488 00:25:18.101 }, 00:25:18.101 { 00:25:18.101 "name": "BaseBdev4", 00:25:18.101 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:18.101 "is_configured": true, 00:25:18.101 "data_offset": 2048, 00:25:18.101 "data_size": 63488 00:25:18.101 } 00:25:18.101 ] 00:25:18.101 }' 00:25:18.101 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:18.101 10:51:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:18.668 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:18.668 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:18.668 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:18.668 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:18.668 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:18.668 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.668 10:51:53 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.927 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:18.927 "name": "raid_bdev1", 00:25:18.927 "uuid": "b6f1f2cd-8e1d-4dc0-88f8-bb23b924f723", 00:25:18.927 "strip_size_kb": 0, 00:25:18.927 "state": "online", 00:25:18.927 "raid_level": "raid1", 00:25:18.927 "superblock": true, 00:25:18.927 "num_base_bdevs": 4, 00:25:18.927 "num_base_bdevs_discovered": 2, 00:25:18.927 "num_base_bdevs_operational": 2, 00:25:18.927 "base_bdevs_list": [ 00:25:18.927 { 00:25:18.927 "name": null, 00:25:18.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.927 "is_configured": false, 00:25:18.927 "data_offset": 2048, 00:25:18.927 "data_size": 63488 00:25:18.927 }, 00:25:18.927 { 00:25:18.927 "name": null, 00:25:18.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.927 "is_configured": false, 00:25:18.927 "data_offset": 2048, 00:25:18.927 "data_size": 63488 00:25:18.927 }, 00:25:18.927 { 00:25:18.927 "name": "BaseBdev3", 00:25:18.927 "uuid": "206e89ea-b3d6-5490-a8fa-e8c2acf65062", 00:25:18.927 "is_configured": true, 00:25:18.927 "data_offset": 2048, 00:25:18.927 "data_size": 63488 00:25:18.927 }, 00:25:18.927 { 00:25:18.927 "name": "BaseBdev4", 00:25:18.927 "uuid": "aca551e1-97cc-5f33-988d-828873639ce1", 00:25:18.927 "is_configured": true, 00:25:18.927 "data_offset": 2048, 00:25:18.927 "data_size": 63488 00:25:18.927 } 00:25:18.927 ] 00:25:18.927 }' 00:25:18.927 10:51:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2140399 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2140399 ']' 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2140399 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2140399 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2140399' 00:25:18.927 killing process with pid 2140399 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2140399 00:25:18.927 Received shutdown signal, test time was about 60.000000 seconds 00:25:18.927 00:25:18.927 Latency(us) 00:25:18.927 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:18.927 =================================================================================================================== 00:25:18.927 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:18.927 [2024-07-12 10:51:54.098092] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:18.927 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2140399 00:25:18.927 [2024-07-12 10:51:54.098190] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:18.927 [2024-07-12 10:51:54.098244] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:18.927 [2024-07-12 10:51:54.098256] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c90ba0 name raid_bdev1, state offline 00:25:19.187 [2024-07-12 10:51:54.153251] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:19.187 10:51:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:19.187 00:25:19.187 real 0m37.171s 00:25:19.187 user 0m53.739s 00:25:19.187 sys 0m6.654s 00:25:19.187 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:19.187 10:51:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:19.187 ************************************ 00:25:19.187 END TEST raid_rebuild_test_sb 00:25:19.187 ************************************ 00:25:19.446 10:51:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:19.446 10:51:54 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:25:19.446 10:51:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:19.446 10:51:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:19.446 10:51:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:19.446 ************************************ 00:25:19.446 START TEST raid_rebuild_test_io 00:25:19.446 ************************************ 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2145599 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2145599 /var/tmp/spdk-raid.sock 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2145599 ']' 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:19.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:19.446 10:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:19.446 [2024-07-12 10:51:54.533981] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:25:19.446 [2024-07-12 10:51:54.534047] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2145599 ] 00:25:19.446 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:19.446 Zero copy mechanism will not be used. 
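[Editor's note] The raid_rebuild_test_io fixture being set up above drives its workload through bdevperf rather than a plain SPDK app. A minimal sketch of that startup step, using the binary, socket and flags printed verbatim in the trace; waitforlisten is the SPDK autotest helper invoked at bdev_raid.sh@597, while backgrounding with & and the variable names here are assumptions for illustration:

  bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
  rpc_sock=/var/tmp/spdk-raid.sock
  # 60 s of 50/50 random read/write (-w randrw -M 50) in 3 MiB I/Os (-o 3M,
  # matching the "I/O size of 3145728" notice above) at queue depth 2, with
  # bdev_raid debug logging enabled; -z keeps bdevperf idle until a
  # perform_tests RPC arrives. Remaining flags are copied verbatim from the trace.
  $bdevperf -r "$rpc_sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  # Proceed only once the UNIX-domain RPC socket accepts connections.
  waitforlisten "$raid_pid" "$rpc_sock"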
00:25:19.705 [2024-07-12 10:51:54.661901] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:19.705 [2024-07-12 10:51:54.764522] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:19.705 [2024-07-12 10:51:54.824355] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:19.705 [2024-07-12 10:51:54.824391] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:20.272 10:51:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:20.272 10:51:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:20.272 10:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:20.272 10:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:20.841 BaseBdev1_malloc 00:25:20.841 10:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:21.408 [2024-07-12 10:51:56.453410] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:21.408 [2024-07-12 10:51:56.453458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.408 [2024-07-12 10:51:56.453493] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e01d40 00:25:21.408 [2024-07-12 10:51:56.453507] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.408 [2024-07-12 10:51:56.455249] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.408 [2024-07-12 10:51:56.455278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:21.408 BaseBdev1 00:25:21.408 10:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:21.408 10:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:21.976 BaseBdev2_malloc 00:25:21.976 10:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:22.234 [2024-07-12 10:51:57.220264] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:22.234 [2024-07-12 10:51:57.220311] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.234 [2024-07-12 10:51:57.220336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e02860 00:25:22.234 [2024-07-12 10:51:57.220350] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.234 [2024-07-12 10:51:57.221894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.234 [2024-07-12 10:51:57.221921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:22.234 BaseBdev2 00:25:22.234 10:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:22.234 10:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:22.800 BaseBdev3_malloc 00:25:22.800 10:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:22.800 [2024-07-12 10:51:57.980083] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:22.800 [2024-07-12 10:51:57.980130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.800 [2024-07-12 10:51:57.980152] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1faf8f0 00:25:22.800 [2024-07-12 10:51:57.980165] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.800 [2024-07-12 10:51:57.981750] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.800 [2024-07-12 10:51:57.981778] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:22.800 BaseBdev3 00:25:23.057 10:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:23.057 10:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:23.315 BaseBdev4_malloc 00:25:23.572 10:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:23.572 [2024-07-12 10:51:58.739861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:23.572 [2024-07-12 10:51:58.739911] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:23.572 [2024-07-12 10:51:58.739932] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1faead0 00:25:23.573 [2024-07-12 10:51:58.739944] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:23.573 [2024-07-12 10:51:58.741478] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:23.573 [2024-07-12 10:51:58.741530] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:23.573 BaseBdev4 00:25:23.573 10:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:24.138 spare_malloc 00:25:24.138 10:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:24.396 spare_delay 00:25:24.396 10:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:24.655 [2024-07-12 10:51:59.731041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:24.655 [2024-07-12 10:51:59.731092] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:24.655 [2024-07-12 10:51:59.731112] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fb35b0 00:25:24.655 [2024-07-12 10:51:59.731125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:24.655 [2024-07-12 10:51:59.732738] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:24.655 [2024-07-12 10:51:59.732766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:24.655 spare 00:25:24.655 10:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:24.915 [2024-07-12 10:51:59.975706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:24.915 [2024-07-12 10:51:59.976890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:24.915 [2024-07-12 10:51:59.976944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:24.915 [2024-07-12 10:51:59.976989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:24.915 [2024-07-12 10:51:59.977070] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f328a0 00:25:24.915 [2024-07-12 10:51:59.977080] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:24.915 [2024-07-12 10:51:59.977285] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1face10 00:25:24.915 [2024-07-12 10:51:59.977428] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f328a0 00:25:24.915 [2024-07-12 10:51:59.977439] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f328a0 00:25:24.915 [2024-07-12 10:51:59.977559] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.915 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.174 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.174 "name": "raid_bdev1", 00:25:25.174 "uuid": 
"2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:25.174 "strip_size_kb": 0, 00:25:25.174 "state": "online", 00:25:25.174 "raid_level": "raid1", 00:25:25.174 "superblock": false, 00:25:25.174 "num_base_bdevs": 4, 00:25:25.174 "num_base_bdevs_discovered": 4, 00:25:25.174 "num_base_bdevs_operational": 4, 00:25:25.174 "base_bdevs_list": [ 00:25:25.174 { 00:25:25.174 "name": "BaseBdev1", 00:25:25.174 "uuid": "a0a324f3-60e4-5796-a701-1be8f4e672ce", 00:25:25.174 "is_configured": true, 00:25:25.174 "data_offset": 0, 00:25:25.174 "data_size": 65536 00:25:25.174 }, 00:25:25.174 { 00:25:25.174 "name": "BaseBdev2", 00:25:25.174 "uuid": "0eb6ff39-b560-580f-924e-20ac42ca6d81", 00:25:25.174 "is_configured": true, 00:25:25.174 "data_offset": 0, 00:25:25.174 "data_size": 65536 00:25:25.174 }, 00:25:25.174 { 00:25:25.174 "name": "BaseBdev3", 00:25:25.174 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:25.174 "is_configured": true, 00:25:25.174 "data_offset": 0, 00:25:25.174 "data_size": 65536 00:25:25.174 }, 00:25:25.174 { 00:25:25.174 "name": "BaseBdev4", 00:25:25.174 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:25.174 "is_configured": true, 00:25:25.174 "data_offset": 0, 00:25:25.174 "data_size": 65536 00:25:25.174 } 00:25:25.174 ] 00:25:25.174 }' 00:25:25.174 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.174 10:52:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:25.742 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:25.742 10:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:26.001 [2024-07-12 10:52:01.010747] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:26.001 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:26.001 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.001 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:26.261 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:26.261 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:26.261 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:26.261 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:26.261 [2024-07-12 10:52:01.297300] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f38970 00:25:26.261 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:26.261 Zero copy mechanism will not be used. 00:25:26.261 Running I/O for 60 seconds... 
00:25:26.261 [2024-07-12 10:52:01.449076] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:26.261 [2024-07-12 10:52:01.449253] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1f38970 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.561 "name": "raid_bdev1", 00:25:26.561 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:26.561 "strip_size_kb": 0, 00:25:26.561 "state": "online", 00:25:26.561 "raid_level": "raid1", 00:25:26.561 "superblock": false, 00:25:26.561 "num_base_bdevs": 4, 00:25:26.561 "num_base_bdevs_discovered": 3, 00:25:26.561 "num_base_bdevs_operational": 3, 00:25:26.561 "base_bdevs_list": [ 00:25:26.561 { 00:25:26.561 "name": null, 00:25:26.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.561 "is_configured": false, 00:25:26.561 "data_offset": 0, 00:25:26.561 "data_size": 65536 00:25:26.561 }, 00:25:26.561 { 00:25:26.561 "name": "BaseBdev2", 00:25:26.561 "uuid": "0eb6ff39-b560-580f-924e-20ac42ca6d81", 00:25:26.561 "is_configured": true, 00:25:26.561 "data_offset": 0, 00:25:26.561 "data_size": 65536 00:25:26.561 }, 00:25:26.561 { 00:25:26.561 "name": "BaseBdev3", 00:25:26.561 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:26.561 "is_configured": true, 00:25:26.561 "data_offset": 0, 00:25:26.561 "data_size": 65536 00:25:26.561 }, 00:25:26.561 { 00:25:26.561 "name": "BaseBdev4", 00:25:26.561 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:26.561 "is_configured": true, 00:25:26.561 "data_offset": 0, 00:25:26.561 "data_size": 65536 00:25:26.561 } 00:25:26.561 ] 00:25:26.561 }' 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.561 10:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:27.497 10:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:27.497 [2024-07-12 10:52:02.606843] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:27.497 10:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:27.756 [2024-07-12 10:52:02.700527] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b08fa0 00:25:27.756 [2024-07-12 10:52:02.702935] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:27.756 [2024-07-12 10:52:02.824531] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:27.756 [2024-07-12 10:52:02.824844] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:28.014 [2024-07-12 10:52:03.047854] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:28.014 [2024-07-12 10:52:03.048580] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:28.273 [2024-07-12 10:52:03.429244] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:28.532 [2024-07-12 10:52:03.674859] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:28.532 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:28.532 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:28.532 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:28.532 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:28.532 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:28.532 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.532 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.791 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.791 "name": "raid_bdev1", 00:25:28.791 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:28.791 "strip_size_kb": 0, 00:25:28.791 "state": "online", 00:25:28.791 "raid_level": "raid1", 00:25:28.791 "superblock": false, 00:25:28.791 "num_base_bdevs": 4, 00:25:28.791 "num_base_bdevs_discovered": 4, 00:25:28.791 "num_base_bdevs_operational": 4, 00:25:28.791 "process": { 00:25:28.791 "type": "rebuild", 00:25:28.791 "target": "spare", 00:25:28.791 "progress": { 00:25:28.791 "blocks": 12288, 00:25:28.791 "percent": 18 00:25:28.791 } 00:25:28.791 }, 00:25:28.791 "base_bdevs_list": [ 00:25:28.791 { 00:25:28.791 "name": "spare", 00:25:28.791 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:28.791 "is_configured": true, 00:25:28.791 "data_offset": 0, 00:25:28.791 "data_size": 65536 00:25:28.791 }, 00:25:28.791 { 00:25:28.791 "name": "BaseBdev2", 00:25:28.791 "uuid": "0eb6ff39-b560-580f-924e-20ac42ca6d81", 00:25:28.791 "is_configured": true, 00:25:28.791 "data_offset": 0, 00:25:28.791 "data_size": 65536 00:25:28.791 }, 00:25:28.791 { 00:25:28.791 "name": "BaseBdev3", 00:25:28.791 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:28.791 
"is_configured": true, 00:25:28.791 "data_offset": 0, 00:25:28.791 "data_size": 65536 00:25:28.791 }, 00:25:28.791 { 00:25:28.791 "name": "BaseBdev4", 00:25:28.791 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:28.791 "is_configured": true, 00:25:28.791 "data_offset": 0, 00:25:28.791 "data_size": 65536 00:25:28.791 } 00:25:28.791 ] 00:25:28.791 }' 00:25:28.791 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.049 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.049 10:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.049 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.049 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:29.049 [2024-07-12 10:52:04.120265] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:29.308 [2024-07-12 10:52:04.266983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:29.308 [2024-07-12 10:52:04.437588] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:29.308 [2024-07-12 10:52:04.448311] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.308 [2024-07-12 10:52:04.448347] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:29.308 [2024-07-12 10:52:04.448358] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:29.308 [2024-07-12 10:52:04.454850] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1f38970 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.308 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.566 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:29.566 "name": "raid_bdev1", 00:25:29.566 "uuid": 
"2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:29.566 "strip_size_kb": 0, 00:25:29.566 "state": "online", 00:25:29.566 "raid_level": "raid1", 00:25:29.566 "superblock": false, 00:25:29.566 "num_base_bdevs": 4, 00:25:29.566 "num_base_bdevs_discovered": 3, 00:25:29.566 "num_base_bdevs_operational": 3, 00:25:29.566 "base_bdevs_list": [ 00:25:29.566 { 00:25:29.566 "name": null, 00:25:29.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.566 "is_configured": false, 00:25:29.566 "data_offset": 0, 00:25:29.566 "data_size": 65536 00:25:29.566 }, 00:25:29.566 { 00:25:29.566 "name": "BaseBdev2", 00:25:29.566 "uuid": "0eb6ff39-b560-580f-924e-20ac42ca6d81", 00:25:29.566 "is_configured": true, 00:25:29.566 "data_offset": 0, 00:25:29.566 "data_size": 65536 00:25:29.566 }, 00:25:29.566 { 00:25:29.566 "name": "BaseBdev3", 00:25:29.566 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:29.566 "is_configured": true, 00:25:29.566 "data_offset": 0, 00:25:29.566 "data_size": 65536 00:25:29.566 }, 00:25:29.566 { 00:25:29.566 "name": "BaseBdev4", 00:25:29.566 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:29.566 "is_configured": true, 00:25:29.566 "data_offset": 0, 00:25:29.566 "data_size": 65536 00:25:29.566 } 00:25:29.566 ] 00:25:29.566 }' 00:25:29.566 10:52:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:29.823 10:52:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:30.386 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:30.387 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.387 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:30.387 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:30.387 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:30.387 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.387 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.644 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.644 "name": "raid_bdev1", 00:25:30.644 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:30.644 "strip_size_kb": 0, 00:25:30.644 "state": "online", 00:25:30.644 "raid_level": "raid1", 00:25:30.644 "superblock": false, 00:25:30.644 "num_base_bdevs": 4, 00:25:30.644 "num_base_bdevs_discovered": 3, 00:25:30.644 "num_base_bdevs_operational": 3, 00:25:30.644 "base_bdevs_list": [ 00:25:30.644 { 00:25:30.644 "name": null, 00:25:30.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.644 "is_configured": false, 00:25:30.644 "data_offset": 0, 00:25:30.644 "data_size": 65536 00:25:30.644 }, 00:25:30.644 { 00:25:30.644 "name": "BaseBdev2", 00:25:30.644 "uuid": "0eb6ff39-b560-580f-924e-20ac42ca6d81", 00:25:30.644 "is_configured": true, 00:25:30.644 "data_offset": 0, 00:25:30.644 "data_size": 65536 00:25:30.644 }, 00:25:30.644 { 00:25:30.644 "name": "BaseBdev3", 00:25:30.644 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:30.644 "is_configured": true, 00:25:30.644 "data_offset": 0, 00:25:30.644 "data_size": 65536 00:25:30.644 }, 00:25:30.644 { 00:25:30.644 "name": "BaseBdev4", 00:25:30.644 
"uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:30.644 "is_configured": true, 00:25:30.644 "data_offset": 0, 00:25:30.644 "data_size": 65536 00:25:30.644 } 00:25:30.644 ] 00:25:30.644 }' 00:25:30.644 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:30.644 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:30.644 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:30.644 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:30.644 10:52:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:30.902 [2024-07-12 10:52:05.951889] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:30.902 [2024-07-12 10:52:06.000589] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f35270 00:25:30.902 [2024-07-12 10:52:06.002138] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:30.902 10:52:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:31.160 [2024-07-12 10:52:06.115205] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:31.160 [2024-07-12 10:52:06.244611] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:31.160 [2024-07-12 10:52:06.244866] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:31.418 [2024-07-12 10:52:06.599416] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:31.418 [2024-07-12 10:52:06.599960] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:31.676 [2024-07-12 10:52:06.823446] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:31.676 [2024-07-12 10:52:06.824095] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:31.933 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:31.933 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.933 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:31.933 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:31.933 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.933 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.933 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.191 [2024-07-12 10:52:07.169162] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:32.191 [2024-07-12 10:52:07.170278] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.191 "name": "raid_bdev1", 00:25:32.191 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:32.191 "strip_size_kb": 0, 00:25:32.191 "state": "online", 00:25:32.191 "raid_level": "raid1", 00:25:32.191 "superblock": false, 00:25:32.191 "num_base_bdevs": 4, 00:25:32.191 "num_base_bdevs_discovered": 4, 00:25:32.191 "num_base_bdevs_operational": 4, 00:25:32.191 "process": { 00:25:32.191 "type": "rebuild", 00:25:32.191 "target": "spare", 00:25:32.191 "progress": { 00:25:32.191 "blocks": 14336, 00:25:32.191 "percent": 21 00:25:32.191 } 00:25:32.191 }, 00:25:32.191 "base_bdevs_list": [ 00:25:32.191 { 00:25:32.191 "name": "spare", 00:25:32.191 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:32.191 "is_configured": true, 00:25:32.191 "data_offset": 0, 00:25:32.191 "data_size": 65536 00:25:32.191 }, 00:25:32.191 { 00:25:32.191 "name": "BaseBdev2", 00:25:32.191 "uuid": "0eb6ff39-b560-580f-924e-20ac42ca6d81", 00:25:32.191 "is_configured": true, 00:25:32.191 "data_offset": 0, 00:25:32.191 "data_size": 65536 00:25:32.191 }, 00:25:32.191 { 00:25:32.191 "name": "BaseBdev3", 00:25:32.191 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:32.191 "is_configured": true, 00:25:32.191 "data_offset": 0, 00:25:32.191 "data_size": 65536 00:25:32.191 }, 00:25:32.191 { 00:25:32.191 "name": "BaseBdev4", 00:25:32.191 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:32.191 "is_configured": true, 00:25:32.191 "data_offset": 0, 00:25:32.191 "data_size": 65536 00:25:32.191 } 00:25:32.191 ] 00:25:32.191 }' 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:32.191 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:32.191 [2024-07-12 10:52:07.383296] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:32.448 [2024-07-12 10:52:07.590866] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:32.448 [2024-07-12 10:52:07.607404] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:32.706 [2024-07-12 10:52:07.663020] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1f38970 00:25:32.706 [2024-07-12 10:52:07.663047] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1f35270 00:25:32.706 10:52:07 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.706 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.964 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.964 "name": "raid_bdev1", 00:25:32.964 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:32.964 "strip_size_kb": 0, 00:25:32.964 "state": "online", 00:25:32.964 "raid_level": "raid1", 00:25:32.964 "superblock": false, 00:25:32.964 "num_base_bdevs": 4, 00:25:32.964 "num_base_bdevs_discovered": 3, 00:25:32.964 "num_base_bdevs_operational": 3, 00:25:32.964 "process": { 00:25:32.964 "type": "rebuild", 00:25:32.964 "target": "spare", 00:25:32.964 "progress": { 00:25:32.964 "blocks": 24576, 00:25:32.964 "percent": 37 00:25:32.964 } 00:25:32.964 }, 00:25:32.964 "base_bdevs_list": [ 00:25:32.964 { 00:25:32.964 "name": "spare", 00:25:32.964 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:32.964 "is_configured": true, 00:25:32.964 "data_offset": 0, 00:25:32.964 "data_size": 65536 00:25:32.964 }, 00:25:32.964 { 00:25:32.964 "name": null, 00:25:32.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.964 "is_configured": false, 00:25:32.964 "data_offset": 0, 00:25:32.964 "data_size": 65536 00:25:32.964 }, 00:25:32.964 { 00:25:32.964 "name": "BaseBdev3", 00:25:32.964 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:32.964 "is_configured": true, 00:25:32.964 "data_offset": 0, 00:25:32.964 "data_size": 65536 00:25:32.964 }, 00:25:32.964 { 00:25:32.964 "name": "BaseBdev4", 00:25:32.964 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:32.964 "is_configured": true, 00:25:32.964 "data_offset": 0, 00:25:32.964 "data_size": 65536 00:25:32.964 } 00:25:32.964 ] 00:25:32.964 }' 00:25:32.964 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:32.964 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:32.964 10:52:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=924 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.964 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.964 [2024-07-12 10:52:08.136972] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:33.222 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.222 "name": "raid_bdev1", 00:25:33.222 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:33.222 "strip_size_kb": 0, 00:25:33.222 "state": "online", 00:25:33.222 "raid_level": "raid1", 00:25:33.222 "superblock": false, 00:25:33.222 "num_base_bdevs": 4, 00:25:33.222 "num_base_bdevs_discovered": 3, 00:25:33.222 "num_base_bdevs_operational": 3, 00:25:33.222 "process": { 00:25:33.222 "type": "rebuild", 00:25:33.222 "target": "spare", 00:25:33.222 "progress": { 00:25:33.222 "blocks": 28672, 00:25:33.222 "percent": 43 00:25:33.222 } 00:25:33.222 }, 00:25:33.222 "base_bdevs_list": [ 00:25:33.222 { 00:25:33.222 "name": "spare", 00:25:33.222 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:33.222 "is_configured": true, 00:25:33.222 "data_offset": 0, 00:25:33.222 "data_size": 65536 00:25:33.222 }, 00:25:33.222 { 00:25:33.222 "name": null, 00:25:33.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.222 "is_configured": false, 00:25:33.222 "data_offset": 0, 00:25:33.222 "data_size": 65536 00:25:33.222 }, 00:25:33.222 { 00:25:33.222 "name": "BaseBdev3", 00:25:33.222 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:33.222 "is_configured": true, 00:25:33.222 "data_offset": 0, 00:25:33.222 "data_size": 65536 00:25:33.222 }, 00:25:33.222 { 00:25:33.222 "name": "BaseBdev4", 00:25:33.222 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:33.222 "is_configured": true, 00:25:33.222 "data_offset": 0, 00:25:33.222 "data_size": 65536 00:25:33.222 } 00:25:33.222 ] 00:25:33.222 }' 00:25:33.222 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.222 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:33.222 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.222 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:33.222 10:52:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:33.788 [2024-07-12 10:52:08.941719] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.353 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.611 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:34.611 "name": "raid_bdev1", 00:25:34.611 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:34.611 "strip_size_kb": 0, 00:25:34.611 "state": "online", 00:25:34.611 "raid_level": "raid1", 00:25:34.611 "superblock": false, 00:25:34.611 "num_base_bdevs": 4, 00:25:34.611 "num_base_bdevs_discovered": 3, 00:25:34.611 "num_base_bdevs_operational": 3, 00:25:34.611 "process": { 00:25:34.611 "type": "rebuild", 00:25:34.611 "target": "spare", 00:25:34.611 "progress": { 00:25:34.611 "blocks": 53248, 00:25:34.611 "percent": 81 00:25:34.611 } 00:25:34.611 }, 00:25:34.611 "base_bdevs_list": [ 00:25:34.611 { 00:25:34.611 "name": "spare", 00:25:34.611 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:34.611 "is_configured": true, 00:25:34.611 "data_offset": 0, 00:25:34.611 "data_size": 65536 00:25:34.611 }, 00:25:34.611 { 00:25:34.611 "name": null, 00:25:34.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.611 "is_configured": false, 00:25:34.611 "data_offset": 0, 00:25:34.611 "data_size": 65536 00:25:34.611 }, 00:25:34.611 { 00:25:34.611 "name": "BaseBdev3", 00:25:34.611 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:34.611 "is_configured": true, 00:25:34.611 "data_offset": 0, 00:25:34.611 "data_size": 65536 00:25:34.611 }, 00:25:34.611 { 00:25:34.611 "name": "BaseBdev4", 00:25:34.611 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:34.611 "is_configured": true, 00:25:34.611 "data_offset": 0, 00:25:34.611 "data_size": 65536 00:25:34.611 } 00:25:34.611 ] 00:25:34.611 }' 00:25:34.611 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.611 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:34.611 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.611 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:34.611 10:52:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:34.868 [2024-07-12 10:52:09.823406] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:25:35.126 [2024-07-12 10:52:10.276445] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:35.384 [2024-07-12 10:52:10.376684] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:35.384 [2024-07-12 10:52:10.379028] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.641 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.899 10:52:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.899 "name": "raid_bdev1", 00:25:35.899 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:35.899 "strip_size_kb": 0, 00:25:35.899 "state": "online", 00:25:35.899 "raid_level": "raid1", 00:25:35.899 "superblock": false, 00:25:35.899 "num_base_bdevs": 4, 00:25:35.899 "num_base_bdevs_discovered": 3, 00:25:35.899 "num_base_bdevs_operational": 3, 00:25:35.899 "base_bdevs_list": [ 00:25:35.899 { 00:25:35.899 "name": "spare", 00:25:35.899 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:35.899 "is_configured": true, 00:25:35.899 "data_offset": 0, 00:25:35.899 "data_size": 65536 00:25:35.899 }, 00:25:35.899 { 00:25:35.899 "name": null, 00:25:35.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.899 "is_configured": false, 00:25:35.899 "data_offset": 0, 00:25:35.899 "data_size": 65536 00:25:35.899 }, 00:25:35.899 { 00:25:35.899 "name": "BaseBdev3", 00:25:35.899 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:35.899 "is_configured": true, 00:25:35.899 "data_offset": 0, 00:25:35.899 "data_size": 65536 00:25:35.899 }, 00:25:35.899 { 00:25:35.899 "name": "BaseBdev4", 00:25:35.899 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:35.899 "is_configured": true, 00:25:35.899 "data_offset": 0, 00:25:35.899 "data_size": 65536 00:25:35.899 } 00:25:35.899 ] 00:25:35.899 }' 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:35.899 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:36.157 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.157 10:52:11 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.157 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:36.157 "name": "raid_bdev1", 00:25:36.157 "uuid": "2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:36.157 "strip_size_kb": 0, 00:25:36.157 "state": "online", 00:25:36.157 "raid_level": "raid1", 00:25:36.157 "superblock": false, 00:25:36.157 "num_base_bdevs": 4, 00:25:36.157 "num_base_bdevs_discovered": 3, 00:25:36.157 "num_base_bdevs_operational": 3, 00:25:36.157 "base_bdevs_list": [ 00:25:36.157 { 00:25:36.157 "name": "spare", 00:25:36.157 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:36.157 "is_configured": true, 00:25:36.157 "data_offset": 0, 00:25:36.157 "data_size": 65536 00:25:36.157 }, 00:25:36.157 { 00:25:36.157 "name": null, 00:25:36.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.157 "is_configured": false, 00:25:36.157 "data_offset": 0, 00:25:36.157 "data_size": 65536 00:25:36.157 }, 00:25:36.157 { 00:25:36.157 "name": "BaseBdev3", 00:25:36.157 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:36.157 "is_configured": true, 00:25:36.157 "data_offset": 0, 00:25:36.157 "data_size": 65536 00:25:36.157 }, 00:25:36.157 { 00:25:36.157 "name": "BaseBdev4", 00:25:36.157 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:36.157 "is_configured": true, 00:25:36.157 "data_offset": 0, 00:25:36.157 "data_size": 65536 00:25:36.157 } 00:25:36.157 ] 00:25:36.157 }' 00:25:36.157 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.453 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.711 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.711 "name": "raid_bdev1", 00:25:36.711 "uuid": 
"2d5eddfb-a8cf-4f30-a529-256a23b848de", 00:25:36.711 "strip_size_kb": 0, 00:25:36.711 "state": "online", 00:25:36.711 "raid_level": "raid1", 00:25:36.711 "superblock": false, 00:25:36.711 "num_base_bdevs": 4, 00:25:36.711 "num_base_bdevs_discovered": 3, 00:25:36.711 "num_base_bdevs_operational": 3, 00:25:36.711 "base_bdevs_list": [ 00:25:36.711 { 00:25:36.711 "name": "spare", 00:25:36.711 "uuid": "8078dc32-b633-53a3-ab28-4e5450e11681", 00:25:36.711 "is_configured": true, 00:25:36.711 "data_offset": 0, 00:25:36.711 "data_size": 65536 00:25:36.711 }, 00:25:36.711 { 00:25:36.711 "name": null, 00:25:36.711 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.711 "is_configured": false, 00:25:36.711 "data_offset": 0, 00:25:36.711 "data_size": 65536 00:25:36.711 }, 00:25:36.711 { 00:25:36.711 "name": "BaseBdev3", 00:25:36.711 "uuid": "63d45b56-2219-5910-a320-91fefb8fac3c", 00:25:36.711 "is_configured": true, 00:25:36.711 "data_offset": 0, 00:25:36.711 "data_size": 65536 00:25:36.711 }, 00:25:36.711 { 00:25:36.711 "name": "BaseBdev4", 00:25:36.711 "uuid": "7791d543-5cc2-5a4e-8a1d-f396daab1b02", 00:25:36.711 "is_configured": true, 00:25:36.711 "data_offset": 0, 00:25:36.711 "data_size": 65536 00:25:36.711 } 00:25:36.711 ] 00:25:36.711 }' 00:25:36.711 10:52:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.711 10:52:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:37.277 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:37.535 [2024-07-12 10:52:12.493529] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:37.535 [2024-07-12 10:52:12.493566] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:37.535 00:25:37.535 Latency(us) 00:25:37.535 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:37.535 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:37.535 raid_bdev1 : 11.26 97.49 292.47 0.00 0.00 14099.82 290.28 123093.70 00:25:37.535 =================================================================================================================== 00:25:37.535 Total : 97.49 292.47 0.00 0.00 14099.82 290.28 123093.70 00:25:37.535 [2024-07-12 10:52:12.593756] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:37.535 [2024-07-12 10:52:12.593784] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:37.535 [2024-07-12 10:52:12.593878] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:37.535 [2024-07-12 10:52:12.593890] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f328a0 name raid_bdev1, state offline 00:25:37.535 0 00:25:37.535 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.535 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 
00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:37.793 10:52:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:38.050 /dev/nbd0 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:38.050 1+0 records in 00:25:38.050 1+0 records out 00:25:38.050 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274397 s, 14.9 MB/s 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:38.050 10:52:13 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:38.050 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:38.051 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:38.308 /dev/nbd1 00:25:38.308 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:38.308 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:38.308 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:38.308 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:38.308 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:38.309 1+0 records in 00:25:38.309 1+0 records out 00:25:38.309 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217038 s, 18.9 MB/s 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:38.309 10:52:13 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:38.309 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:38.566 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:38.566 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:38.566 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:38.566 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:38.566 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:38.566 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:38.566 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:38.825 10:52:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:39.083 /dev/nbd1 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:39.083 1+0 records in 00:25:39.083 1+0 records out 00:25:39.083 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279067 s, 14.7 MB/s 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.083 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.341 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2145599 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2145599 ']' 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2145599 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2145599 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2145599' 00:25:39.599 killing process with pid 2145599 00:25:39.599 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2145599 00:25:39.599 Received shutdown signal, test time was about 13.447596 seconds 00:25:39.600 00:25:39.600 Latency(us) 00:25:39.600 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:39.600 
=================================================================================================================== 00:25:39.600 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:39.600 [2024-07-12 10:52:14.780125] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:39.600 10:52:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2145599 00:25:39.869 [2024-07-12 10:52:14.824935] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:40.142 00:25:40.142 real 0m20.593s 00:25:40.142 user 0m32.343s 00:25:40.142 sys 0m3.710s 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:40.142 ************************************ 00:25:40.142 END TEST raid_rebuild_test_io 00:25:40.142 ************************************ 00:25:40.142 10:52:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:40.142 10:52:15 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:25:40.142 10:52:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:40.142 10:52:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:40.142 10:52:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:40.142 ************************************ 00:25:40.142 START TEST raid_rebuild_test_sb_io 00:25:40.142 ************************************ 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 
00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2148488 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2148488 /var/tmp/spdk-raid.sock 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2148488 ']' 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:40.142 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:40.142 10:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:40.142 [2024-07-12 10:52:15.206029] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:25:40.142 [2024-07-12 10:52:15.206094] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2148488 ] 00:25:40.142 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:40.142 Zero copy mechanism will not be used. 
00:25:40.142 [2024-07-12 10:52:15.323709] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:40.400 [2024-07-12 10:52:15.427999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:40.400 [2024-07-12 10:52:15.488442] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:40.400 [2024-07-12 10:52:15.488500] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:40.966 10:52:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:40.966 10:52:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:25:40.966 10:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:40.966 10:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:41.224 BaseBdev1_malloc 00:25:41.224 10:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:41.791 [2024-07-12 10:52:16.868406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:41.791 [2024-07-12 10:52:16.868454] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:41.791 [2024-07-12 10:52:16.868486] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9dfd40 00:25:41.791 [2024-07-12 10:52:16.868500] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:41.791 [2024-07-12 10:52:16.870259] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:41.791 [2024-07-12 10:52:16.870288] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:41.791 BaseBdev1 00:25:41.791 10:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:41.791 10:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:42.050 BaseBdev2_malloc 00:25:42.050 10:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:42.309 [2024-07-12 10:52:17.375857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:42.309 [2024-07-12 10:52:17.375907] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.309 [2024-07-12 10:52:17.375936] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9e0860 00:25:42.309 [2024-07-12 10:52:17.375949] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.309 [2024-07-12 10:52:17.377589] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.309 [2024-07-12 10:52:17.377617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:42.309 BaseBdev2 00:25:42.309 10:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:42.309 10:52:17 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:42.568 BaseBdev3_malloc 00:25:42.568 10:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:42.568 [2024-07-12 10:52:17.725324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:42.568 [2024-07-12 10:52:17.725366] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.568 [2024-07-12 10:52:17.725391] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb8d8f0 00:25:42.568 [2024-07-12 10:52:17.725404] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.568 [2024-07-12 10:52:17.726805] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.568 [2024-07-12 10:52:17.726832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:42.568 BaseBdev3 00:25:42.568 10:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:42.568 10:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:42.828 BaseBdev4_malloc 00:25:42.828 10:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:43.086 [2024-07-12 10:52:18.082818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:43.086 [2024-07-12 10:52:18.082859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.086 [2024-07-12 10:52:18.082879] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb8cad0 00:25:43.086 [2024-07-12 10:52:18.082891] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.086 [2024-07-12 10:52:18.084296] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.086 [2024-07-12 10:52:18.084323] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:43.086 BaseBdev4 00:25:43.086 10:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:43.345 spare_malloc 00:25:43.345 10:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:43.604 spare_delay 00:25:43.604 10:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:43.864 [2024-07-12 10:52:18.841453] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:43.864 [2024-07-12 10:52:18.841503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.864 [2024-07-12 
10:52:18.841525] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb915b0 00:25:43.864 [2024-07-12 10:52:18.841538] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.864 [2024-07-12 10:52:18.843155] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.864 [2024-07-12 10:52:18.843183] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:43.864 spare 00:25:43.864 10:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:44.123 [2024-07-12 10:52:19.082125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:44.123 [2024-07-12 10:52:19.083462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:44.123 [2024-07-12 10:52:19.083526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:44.123 [2024-07-12 10:52:19.083572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:44.123 [2024-07-12 10:52:19.083772] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb108a0 00:25:44.123 [2024-07-12 10:52:19.083783] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:44.123 [2024-07-12 10:52:19.083990] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb8ae10 00:25:44.123 [2024-07-12 10:52:19.084141] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb108a0 00:25:44.123 [2024-07-12 10:52:19.084156] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb108a0 00:25:44.123 [2024-07-12 10:52:19.084255] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.123 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.383 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:25:44.383 "name": "raid_bdev1", 00:25:44.383 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:44.383 "strip_size_kb": 0, 00:25:44.383 "state": "online", 00:25:44.383 "raid_level": "raid1", 00:25:44.383 "superblock": true, 00:25:44.383 "num_base_bdevs": 4, 00:25:44.383 "num_base_bdevs_discovered": 4, 00:25:44.383 "num_base_bdevs_operational": 4, 00:25:44.383 "base_bdevs_list": [ 00:25:44.383 { 00:25:44.383 "name": "BaseBdev1", 00:25:44.383 "uuid": "ad425746-723b-5b6e-a8ce-726ba3cd57fd", 00:25:44.383 "is_configured": true, 00:25:44.383 "data_offset": 2048, 00:25:44.383 "data_size": 63488 00:25:44.383 }, 00:25:44.383 { 00:25:44.383 "name": "BaseBdev2", 00:25:44.383 "uuid": "7d7cf3ce-44af-5d0c-9591-b47faeddacca", 00:25:44.383 "is_configured": true, 00:25:44.383 "data_offset": 2048, 00:25:44.383 "data_size": 63488 00:25:44.383 }, 00:25:44.383 { 00:25:44.383 "name": "BaseBdev3", 00:25:44.383 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:44.383 "is_configured": true, 00:25:44.383 "data_offset": 2048, 00:25:44.383 "data_size": 63488 00:25:44.383 }, 00:25:44.383 { 00:25:44.383 "name": "BaseBdev4", 00:25:44.383 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:44.383 "is_configured": true, 00:25:44.383 "data_offset": 2048, 00:25:44.383 "data_size": 63488 00:25:44.383 } 00:25:44.383 ] 00:25:44.383 }' 00:25:44.383 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.383 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:44.950 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:44.950 10:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:45.209 [2024-07-12 10:52:20.181322] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:45.209 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:45.209 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.209 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:45.467 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:45.467 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:45.467 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:45.467 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:45.467 [2024-07-12 10:52:20.560238] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9df670 00:25:45.467 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:45.467 Zero copy mechanism will not be used. 00:25:45.467 Running I/O for 60 seconds... 
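[editor's note] The verification pattern used throughout this test is visible above: each verify_raid_bdev_state / verify_raid_bdev_process call dumps the array with bdev_raid_get_bdevs all and extracts fields with jq. A condensed sketch of that pattern follows, under the assumption the same socket is still up; the helper name check_raid_state and the SOCK/RPC variables are illustrative, not names from bdev_raid.sh.

# Condensed form of the state checks the trace performs via bdev_raid_get_bdevs + jq.
SOCK=/var/tmp/spdk-raid.sock
RPC=scripts/rpc.py

check_raid_state() {
    local info
    info=$("$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    jq -r '.state' <<< "$info"                          # "online" throughout this run
    jq -r '.num_base_bdevs_discovered' <<< "$info"      # 4 at creation, 3 once BaseBdev1 is removed
    jq -r '.base_bdevs_list[0].data_offset' <<< "$info" # 2048, matching the data_offset read above
}
check_raid_state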
00:25:45.726 [2024-07-12 10:52:20.683633] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:45.726 [2024-07-12 10:52:20.691859] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x9df670 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.726 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.986 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:45.986 "name": "raid_bdev1", 00:25:45.986 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:45.986 "strip_size_kb": 0, 00:25:45.986 "state": "online", 00:25:45.986 "raid_level": "raid1", 00:25:45.986 "superblock": true, 00:25:45.986 "num_base_bdevs": 4, 00:25:45.986 "num_base_bdevs_discovered": 3, 00:25:45.986 "num_base_bdevs_operational": 3, 00:25:45.986 "base_bdevs_list": [ 00:25:45.986 { 00:25:45.986 "name": null, 00:25:45.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.986 "is_configured": false, 00:25:45.986 "data_offset": 2048, 00:25:45.986 "data_size": 63488 00:25:45.986 }, 00:25:45.986 { 00:25:45.986 "name": "BaseBdev2", 00:25:45.986 "uuid": "7d7cf3ce-44af-5d0c-9591-b47faeddacca", 00:25:45.986 "is_configured": true, 00:25:45.986 "data_offset": 2048, 00:25:45.986 "data_size": 63488 00:25:45.986 }, 00:25:45.986 { 00:25:45.986 "name": "BaseBdev3", 00:25:45.986 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:45.986 "is_configured": true, 00:25:45.986 "data_offset": 2048, 00:25:45.986 "data_size": 63488 00:25:45.986 }, 00:25:45.986 { 00:25:45.986 "name": "BaseBdev4", 00:25:45.986 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:45.986 "is_configured": true, 00:25:45.986 "data_offset": 2048, 00:25:45.986 "data_size": 63488 00:25:45.986 } 00:25:45.986 ] 00:25:45.986 }' 00:25:45.986 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:45.986 10:52:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:46.554 10:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 
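[editor's note] The degrade-then-rebuild sequence above is driven by two RPCs while bdevperf keeps the random read/write load running: BaseBdev1 is pulled from the live array, then the prepared spare (built earlier in the trace as spare_malloc -> spare_delay -> spare) is handed to it. A minimal sketch of that sequence on the same socket; SOCK/RPC are my shorthand as before.

# Degrade the array under I/O, then add the prepared spare so the rebuild
# process starts ("Started rebuild on raid bdev raid_bdev1" in the trace).
SOCK=/var/tmp/spdk-raid.sock
RPC=scripts/rpc.py

"$RPC" -s "$SOCK" bdev_raid_remove_base_bdev BaseBdev1       # 4/4 -> 3/3 operational, state stays online
"$RPC" -s "$SOCK" bdev_raid_add_base_bdev raid_bdev1 spare   # spare is claimed, rebuild begins

# Rebuild progress is what the repeated bdev_raid_get_bdevs dumps below report
# under .process: type "rebuild", target "spare", progress.blocks 14336 -> 63488.
"$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process.progress.blocks'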
00:25:46.812 [2024-07-12 10:52:21.819338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:46.812 10:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:46.812 [2024-07-12 10:52:21.911695] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb12b40 00:25:46.812 [2024-07-12 10:52:21.914066] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:47.071 [2024-07-12 10:52:22.032179] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:47.330 [2024-07-12 10:52:22.283460] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:47.589 [2024-07-12 10:52:22.546808] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:47.589 [2024-07-12 10:52:22.547936] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:47.589 [2024-07-12 10:52:22.760770] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:47.589 [2024-07-12 10:52:22.760928] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:47.848 10:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:47.848 10:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:47.848 10:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:47.848 10:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:47.848 10:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:47.848 10:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.848 10:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.848 [2024-07-12 10:52:23.017245] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:48.106 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:48.106 "name": "raid_bdev1", 00:25:48.106 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:48.106 "strip_size_kb": 0, 00:25:48.106 "state": "online", 00:25:48.106 "raid_level": "raid1", 00:25:48.106 "superblock": true, 00:25:48.106 "num_base_bdevs": 4, 00:25:48.106 "num_base_bdevs_discovered": 4, 00:25:48.106 "num_base_bdevs_operational": 4, 00:25:48.106 "process": { 00:25:48.106 "type": "rebuild", 00:25:48.106 "target": "spare", 00:25:48.106 "progress": { 00:25:48.106 "blocks": 14336, 00:25:48.106 "percent": 22 00:25:48.106 } 00:25:48.106 }, 00:25:48.106 "base_bdevs_list": [ 00:25:48.106 { 00:25:48.106 "name": "spare", 00:25:48.106 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:25:48.106 "is_configured": true, 00:25:48.106 "data_offset": 2048, 00:25:48.106 "data_size": 63488 00:25:48.106 }, 00:25:48.106 { 00:25:48.106 "name": "BaseBdev2", 00:25:48.106 "uuid": "7d7cf3ce-44af-5d0c-9591-b47faeddacca", 
00:25:48.106 "is_configured": true, 00:25:48.106 "data_offset": 2048, 00:25:48.106 "data_size": 63488 00:25:48.106 }, 00:25:48.106 { 00:25:48.106 "name": "BaseBdev3", 00:25:48.106 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:48.106 "is_configured": true, 00:25:48.106 "data_offset": 2048, 00:25:48.106 "data_size": 63488 00:25:48.106 }, 00:25:48.106 { 00:25:48.106 "name": "BaseBdev4", 00:25:48.106 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:48.106 "is_configured": true, 00:25:48.106 "data_offset": 2048, 00:25:48.106 "data_size": 63488 00:25:48.106 } 00:25:48.106 ] 00:25:48.106 }' 00:25:48.106 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.106 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:48.106 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.106 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:48.106 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:48.106 [2024-07-12 10:52:23.252513] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:48.364 [2024-07-12 10:52:23.406764] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:48.364 [2024-07-12 10:52:23.498590] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:48.364 [2024-07-12 10:52:23.498883] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:48.364 [2024-07-12 10:52:23.507575] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:48.364 [2024-07-12 10:52:23.528644] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:48.364 [2024-07-12 10:52:23.528686] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:48.364 [2024-07-12 10:52:23.528697] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:48.623 [2024-07-12 10:52:23.560204] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x9df670 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.623 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.883 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:48.883 "name": "raid_bdev1", 00:25:48.883 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:48.883 "strip_size_kb": 0, 00:25:48.883 "state": "online", 00:25:48.883 "raid_level": "raid1", 00:25:48.883 "superblock": true, 00:25:48.883 "num_base_bdevs": 4, 00:25:48.883 "num_base_bdevs_discovered": 3, 00:25:48.883 "num_base_bdevs_operational": 3, 00:25:48.883 "base_bdevs_list": [ 00:25:48.883 { 00:25:48.883 "name": null, 00:25:48.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.883 "is_configured": false, 00:25:48.883 "data_offset": 2048, 00:25:48.883 "data_size": 63488 00:25:48.883 }, 00:25:48.883 { 00:25:48.883 "name": "BaseBdev2", 00:25:48.883 "uuid": "7d7cf3ce-44af-5d0c-9591-b47faeddacca", 00:25:48.883 "is_configured": true, 00:25:48.883 "data_offset": 2048, 00:25:48.883 "data_size": 63488 00:25:48.883 }, 00:25:48.883 { 00:25:48.883 "name": "BaseBdev3", 00:25:48.883 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:48.883 "is_configured": true, 00:25:48.883 "data_offset": 2048, 00:25:48.883 "data_size": 63488 00:25:48.883 }, 00:25:48.883 { 00:25:48.883 "name": "BaseBdev4", 00:25:48.883 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:48.883 "is_configured": true, 00:25:48.883 "data_offset": 2048, 00:25:48.883 "data_size": 63488 00:25:48.883 } 00:25:48.883 ] 00:25:48.883 }' 00:25:48.883 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:48.883 10:52:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:49.451 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:49.451 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:49.451 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:49.451 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:49.451 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:49.451 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.451 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.711 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:49.711 "name": "raid_bdev1", 00:25:49.711 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:49.711 "strip_size_kb": 0, 00:25:49.711 "state": "online", 00:25:49.711 "raid_level": "raid1", 00:25:49.711 "superblock": true, 00:25:49.711 "num_base_bdevs": 4, 00:25:49.711 "num_base_bdevs_discovered": 3, 00:25:49.711 "num_base_bdevs_operational": 3, 00:25:49.711 "base_bdevs_list": [ 00:25:49.711 { 00:25:49.711 "name": null, 00:25:49.711 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:49.711 "is_configured": false, 00:25:49.711 "data_offset": 2048, 00:25:49.711 "data_size": 63488 00:25:49.711 }, 00:25:49.711 { 00:25:49.711 "name": "BaseBdev2", 00:25:49.711 "uuid": "7d7cf3ce-44af-5d0c-9591-b47faeddacca", 00:25:49.711 "is_configured": true, 00:25:49.711 "data_offset": 2048, 00:25:49.711 "data_size": 63488 00:25:49.711 }, 00:25:49.711 { 00:25:49.711 "name": "BaseBdev3", 00:25:49.711 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:49.711 "is_configured": true, 00:25:49.711 "data_offset": 2048, 00:25:49.711 "data_size": 63488 00:25:49.711 }, 00:25:49.711 { 00:25:49.711 "name": "BaseBdev4", 00:25:49.711 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:49.711 "is_configured": true, 00:25:49.711 "data_offset": 2048, 00:25:49.711 "data_size": 63488 00:25:49.711 } 00:25:49.711 ] 00:25:49.711 }' 00:25:49.711 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:49.711 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:49.711 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:49.711 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:49.711 10:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:49.971 [2024-07-12 10:52:25.087673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:49.971 [2024-07-12 10:52:25.153439] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb148f0 00:25:49.971 10:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:49.971 [2024-07-12 10:52:25.154992] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:50.230 [2024-07-12 10:52:25.263978] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:50.230 [2024-07-12 10:52:25.265277] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:50.490 [2024-07-12 10:52:25.470170] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:50.490 [2024-07-12 10:52:25.470885] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:50.749 [2024-07-12 10:52:25.785348] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:50.749 [2024-07-12 10:52:25.786538] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:51.008 [2024-07-12 10:52:26.009064] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:51.008 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:51.008 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:51.008 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:51.008 10:52:26 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:51.008 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:51.008 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.008 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.267 [2024-07-12 10:52:26.221899] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:51.267 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:51.267 "name": "raid_bdev1", 00:25:51.267 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:51.267 "strip_size_kb": 0, 00:25:51.267 "state": "online", 00:25:51.267 "raid_level": "raid1", 00:25:51.267 "superblock": true, 00:25:51.267 "num_base_bdevs": 4, 00:25:51.267 "num_base_bdevs_discovered": 4, 00:25:51.267 "num_base_bdevs_operational": 4, 00:25:51.267 "process": { 00:25:51.267 "type": "rebuild", 00:25:51.267 "target": "spare", 00:25:51.267 "progress": { 00:25:51.267 "blocks": 14336, 00:25:51.267 "percent": 22 00:25:51.267 } 00:25:51.267 }, 00:25:51.267 "base_bdevs_list": [ 00:25:51.267 { 00:25:51.267 "name": "spare", 00:25:51.267 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:25:51.267 "is_configured": true, 00:25:51.267 "data_offset": 2048, 00:25:51.267 "data_size": 63488 00:25:51.267 }, 00:25:51.267 { 00:25:51.267 "name": "BaseBdev2", 00:25:51.267 "uuid": "7d7cf3ce-44af-5d0c-9591-b47faeddacca", 00:25:51.267 "is_configured": true, 00:25:51.267 "data_offset": 2048, 00:25:51.267 "data_size": 63488 00:25:51.267 }, 00:25:51.267 { 00:25:51.267 "name": "BaseBdev3", 00:25:51.267 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:51.267 "is_configured": true, 00:25:51.267 "data_offset": 2048, 00:25:51.267 "data_size": 63488 00:25:51.267 }, 00:25:51.267 { 00:25:51.267 "name": "BaseBdev4", 00:25:51.267 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:51.267 "is_configured": true, 00:25:51.267 "data_offset": 2048, 00:25:51.267 "data_size": 63488 00:25:51.267 } 00:25:51.267 ] 00:25:51.267 }' 00:25:51.267 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:51.267 [2024-07-12 10:52:26.445837] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:51.267 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:51.267 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:51.525 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:51.525 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:51.525 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:51.525 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:51.525 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:51.525 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:51.525 10:52:26 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:51.525 10:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:51.784 [2024-07-12 10:52:26.730129] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:51.784 [2024-07-12 10:52:26.808554] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:52.042 [2024-07-12 10:52:27.019152] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x9df670 00:25:52.042 [2024-07-12 10:52:27.019178] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xb148f0 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.042 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.042 [2024-07-12 10:52:27.149409] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:52.042 [2024-07-12 10:52:27.149901] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:52.302 "name": "raid_bdev1", 00:25:52.302 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:52.302 "strip_size_kb": 0, 00:25:52.302 "state": "online", 00:25:52.302 "raid_level": "raid1", 00:25:52.302 "superblock": true, 00:25:52.302 "num_base_bdevs": 4, 00:25:52.302 "num_base_bdevs_discovered": 3, 00:25:52.302 "num_base_bdevs_operational": 3, 00:25:52.302 "process": { 00:25:52.302 "type": "rebuild", 00:25:52.302 "target": "spare", 00:25:52.302 "progress": { 00:25:52.302 "blocks": 22528, 00:25:52.302 "percent": 35 00:25:52.302 } 00:25:52.302 }, 00:25:52.302 "base_bdevs_list": [ 00:25:52.302 { 00:25:52.302 "name": "spare", 00:25:52.302 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:25:52.302 "is_configured": true, 00:25:52.302 "data_offset": 2048, 00:25:52.302 "data_size": 63488 00:25:52.302 }, 00:25:52.302 { 00:25:52.302 "name": null, 00:25:52.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.302 "is_configured": false, 00:25:52.302 "data_offset": 2048, 00:25:52.302 "data_size": 63488 00:25:52.302 }, 00:25:52.302 { 00:25:52.302 "name": "BaseBdev3", 00:25:52.302 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:52.302 
"is_configured": true, 00:25:52.302 "data_offset": 2048, 00:25:52.302 "data_size": 63488 00:25:52.302 }, 00:25:52.302 { 00:25:52.302 "name": "BaseBdev4", 00:25:52.302 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:52.302 "is_configured": true, 00:25:52.302 "data_offset": 2048, 00:25:52.302 "data_size": 63488 00:25:52.302 } 00:25:52.302 ] 00:25:52.302 }' 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=943 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.302 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.302 [2024-07-12 10:52:27.484409] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:52.302 [2024-07-12 10:52:27.484965] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:52.561 [2024-07-12 10:52:27.644889] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:52.561 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:52.561 "name": "raid_bdev1", 00:25:52.561 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:52.561 "strip_size_kb": 0, 00:25:52.561 "state": "online", 00:25:52.561 "raid_level": "raid1", 00:25:52.561 "superblock": true, 00:25:52.561 "num_base_bdevs": 4, 00:25:52.561 "num_base_bdevs_discovered": 3, 00:25:52.561 "num_base_bdevs_operational": 3, 00:25:52.561 "process": { 00:25:52.561 "type": "rebuild", 00:25:52.561 "target": "spare", 00:25:52.561 "progress": { 00:25:52.561 "blocks": 26624, 00:25:52.561 "percent": 41 00:25:52.561 } 00:25:52.561 }, 00:25:52.561 "base_bdevs_list": [ 00:25:52.561 { 00:25:52.561 "name": "spare", 00:25:52.561 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:25:52.561 "is_configured": true, 00:25:52.561 "data_offset": 2048, 00:25:52.561 "data_size": 63488 00:25:52.562 }, 00:25:52.562 { 00:25:52.562 "name": null, 00:25:52.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.562 "is_configured": false, 00:25:52.562 "data_offset": 2048, 00:25:52.562 
"data_size": 63488 00:25:52.562 }, 00:25:52.562 { 00:25:52.562 "name": "BaseBdev3", 00:25:52.562 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:52.562 "is_configured": true, 00:25:52.562 "data_offset": 2048, 00:25:52.562 "data_size": 63488 00:25:52.562 }, 00:25:52.562 { 00:25:52.562 "name": "BaseBdev4", 00:25:52.562 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:52.562 "is_configured": true, 00:25:52.562 "data_offset": 2048, 00:25:52.562 "data_size": 63488 00:25:52.562 } 00:25:52.562 ] 00:25:52.562 }' 00:25:52.562 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:52.562 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:52.562 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:52.562 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:52.562 10:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:53.129 [2024-07-12 10:52:28.177399] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:53.388 [2024-07-12 10:52:28.537954] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:53.388 [2024-07-12 10:52:28.538262] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.647 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.647 [2024-07-12 10:52:28.768831] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:25:53.939 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:53.939 "name": "raid_bdev1", 00:25:53.939 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:53.939 "strip_size_kb": 0, 00:25:53.939 "state": "online", 00:25:53.939 "raid_level": "raid1", 00:25:53.939 "superblock": true, 00:25:53.939 "num_base_bdevs": 4, 00:25:53.939 "num_base_bdevs_discovered": 3, 00:25:53.939 "num_base_bdevs_operational": 3, 00:25:53.939 "process": { 00:25:53.939 "type": "rebuild", 00:25:53.939 "target": "spare", 00:25:53.939 "progress": { 00:25:53.939 "blocks": 43008, 00:25:53.939 "percent": 67 00:25:53.939 } 00:25:53.939 }, 00:25:53.939 "base_bdevs_list": [ 00:25:53.939 { 00:25:53.939 "name": "spare", 00:25:53.939 "uuid": 
"5c7585ef-108c-513b-87aa-137beb36603e", 00:25:53.939 "is_configured": true, 00:25:53.939 "data_offset": 2048, 00:25:53.939 "data_size": 63488 00:25:53.939 }, 00:25:53.939 { 00:25:53.939 "name": null, 00:25:53.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.939 "is_configured": false, 00:25:53.939 "data_offset": 2048, 00:25:53.939 "data_size": 63488 00:25:53.939 }, 00:25:53.939 { 00:25:53.939 "name": "BaseBdev3", 00:25:53.939 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:53.939 "is_configured": true, 00:25:53.939 "data_offset": 2048, 00:25:53.939 "data_size": 63488 00:25:53.939 }, 00:25:53.939 { 00:25:53.939 "name": "BaseBdev4", 00:25:53.939 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:53.939 "is_configured": true, 00:25:53.939 "data_offset": 2048, 00:25:53.939 "data_size": 63488 00:25:53.939 } 00:25:53.939 ] 00:25:53.939 }' 00:25:53.939 10:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:53.939 10:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:53.939 10:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:53.939 10:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:53.939 10:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:54.222 [2024-07-12 10:52:29.349153] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:54.481 [2024-07-12 10:52:29.563144] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:25:54.741 [2024-07-12 10:52:29.905651] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.000 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.259 [2024-07-12 10:52:30.246666] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:55.259 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.259 "name": "raid_bdev1", 00:25:55.259 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:55.259 "strip_size_kb": 0, 00:25:55.259 "state": "online", 00:25:55.259 "raid_level": "raid1", 00:25:55.259 "superblock": true, 00:25:55.259 "num_base_bdevs": 4, 00:25:55.259 "num_base_bdevs_discovered": 3, 00:25:55.259 "num_base_bdevs_operational": 3, 00:25:55.259 "process": { 
00:25:55.259 "type": "rebuild", 00:25:55.259 "target": "spare", 00:25:55.259 "progress": { 00:25:55.259 "blocks": 63488, 00:25:55.259 "percent": 100 00:25:55.259 } 00:25:55.259 }, 00:25:55.259 "base_bdevs_list": [ 00:25:55.259 { 00:25:55.259 "name": "spare", 00:25:55.259 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:25:55.259 "is_configured": true, 00:25:55.259 "data_offset": 2048, 00:25:55.259 "data_size": 63488 00:25:55.259 }, 00:25:55.259 { 00:25:55.259 "name": null, 00:25:55.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.259 "is_configured": false, 00:25:55.259 "data_offset": 2048, 00:25:55.259 "data_size": 63488 00:25:55.259 }, 00:25:55.259 { 00:25:55.259 "name": "BaseBdev3", 00:25:55.259 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:55.259 "is_configured": true, 00:25:55.259 "data_offset": 2048, 00:25:55.259 "data_size": 63488 00:25:55.259 }, 00:25:55.259 { 00:25:55.259 "name": "BaseBdev4", 00:25:55.259 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:55.259 "is_configured": true, 00:25:55.259 "data_offset": 2048, 00:25:55.259 "data_size": 63488 00:25:55.259 } 00:25:55.259 ] 00:25:55.259 }' 00:25:55.259 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:55.259 [2024-07-12 10:52:30.346884] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:55.259 [2024-07-12 10:52:30.357377] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:55.259 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:55.259 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:55.259 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:55.259 10:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.635 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.635 "name": "raid_bdev1", 00:25:56.635 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:56.635 "strip_size_kb": 0, 00:25:56.635 "state": "online", 00:25:56.635 "raid_level": "raid1", 00:25:56.635 "superblock": true, 00:25:56.635 "num_base_bdevs": 4, 00:25:56.635 "num_base_bdevs_discovered": 3, 00:25:56.636 "num_base_bdevs_operational": 3, 00:25:56.636 "base_bdevs_list": [ 00:25:56.636 { 00:25:56.636 "name": "spare", 00:25:56.636 "uuid": 
"5c7585ef-108c-513b-87aa-137beb36603e", 00:25:56.636 "is_configured": true, 00:25:56.636 "data_offset": 2048, 00:25:56.636 "data_size": 63488 00:25:56.636 }, 00:25:56.636 { 00:25:56.636 "name": null, 00:25:56.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.636 "is_configured": false, 00:25:56.636 "data_offset": 2048, 00:25:56.636 "data_size": 63488 00:25:56.636 }, 00:25:56.636 { 00:25:56.636 "name": "BaseBdev3", 00:25:56.636 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:56.636 "is_configured": true, 00:25:56.636 "data_offset": 2048, 00:25:56.636 "data_size": 63488 00:25:56.636 }, 00:25:56.636 { 00:25:56.636 "name": "BaseBdev4", 00:25:56.636 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:56.636 "is_configured": true, 00:25:56.636 "data_offset": 2048, 00:25:56.636 "data_size": 63488 00:25:56.636 } 00:25:56.636 ] 00:25:56.636 }' 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.636 10:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.894 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.894 "name": "raid_bdev1", 00:25:56.894 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:56.894 "strip_size_kb": 0, 00:25:56.894 "state": "online", 00:25:56.894 "raid_level": "raid1", 00:25:56.894 "superblock": true, 00:25:56.894 "num_base_bdevs": 4, 00:25:56.894 "num_base_bdevs_discovered": 3, 00:25:56.894 "num_base_bdevs_operational": 3, 00:25:56.894 "base_bdevs_list": [ 00:25:56.894 { 00:25:56.894 "name": "spare", 00:25:56.894 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:25:56.894 "is_configured": true, 00:25:56.894 "data_offset": 2048, 00:25:56.894 "data_size": 63488 00:25:56.894 }, 00:25:56.894 { 00:25:56.894 "name": null, 00:25:56.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.894 "is_configured": false, 00:25:56.894 "data_offset": 2048, 00:25:56.894 "data_size": 63488 00:25:56.894 }, 00:25:56.894 { 00:25:56.894 "name": "BaseBdev3", 00:25:56.894 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:56.894 "is_configured": true, 00:25:56.894 "data_offset": 2048, 00:25:56.894 "data_size": 63488 00:25:56.894 }, 00:25:56.894 { 00:25:56.894 "name": "BaseBdev4", 00:25:56.894 
"uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:56.894 "is_configured": true, 00:25:56.894 "data_offset": 2048, 00:25:56.894 "data_size": 63488 00:25:56.894 } 00:25:56.894 ] 00:25:56.894 }' 00:25:56.894 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.894 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:56.894 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.153 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.412 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:57.412 "name": "raid_bdev1", 00:25:57.412 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:25:57.412 "strip_size_kb": 0, 00:25:57.412 "state": "online", 00:25:57.412 "raid_level": "raid1", 00:25:57.412 "superblock": true, 00:25:57.412 "num_base_bdevs": 4, 00:25:57.412 "num_base_bdevs_discovered": 3, 00:25:57.412 "num_base_bdevs_operational": 3, 00:25:57.412 "base_bdevs_list": [ 00:25:57.412 { 00:25:57.412 "name": "spare", 00:25:57.412 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:25:57.412 "is_configured": true, 00:25:57.412 "data_offset": 2048, 00:25:57.412 "data_size": 63488 00:25:57.412 }, 00:25:57.412 { 00:25:57.412 "name": null, 00:25:57.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.412 "is_configured": false, 00:25:57.412 "data_offset": 2048, 00:25:57.412 "data_size": 63488 00:25:57.412 }, 00:25:57.412 { 00:25:57.412 "name": "BaseBdev3", 00:25:57.412 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:25:57.412 "is_configured": true, 00:25:57.412 "data_offset": 2048, 00:25:57.412 "data_size": 63488 00:25:57.412 }, 00:25:57.412 { 00:25:57.412 "name": "BaseBdev4", 00:25:57.412 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:25:57.412 "is_configured": true, 00:25:57.412 "data_offset": 2048, 00:25:57.412 "data_size": 63488 00:25:57.412 } 00:25:57.412 ] 00:25:57.412 }' 00:25:57.412 
10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:57.412 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:57.990 10:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:58.251 [2024-07-12 10:52:33.197424] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:58.251 [2024-07-12 10:52:33.197456] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:58.251 00:25:58.251 Latency(us) 00:25:58.251 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:58.251 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:58.251 raid_bdev1 : 12.64 88.37 265.11 0.00 0.00 15790.06 306.31 118534.68 00:25:58.251 =================================================================================================================== 00:25:58.251 Total : 88.37 265.11 0.00 0.00 15790.06 306.31 118534.68 00:25:58.251 [2024-07-12 10:52:33.237499] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:58.251 [2024-07-12 10:52:33.237530] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:58.251 [2024-07-12 10:52:33.237626] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:58.251 [2024-07-12 10:52:33.237639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb108a0 name raid_bdev1, state offline 00:25:58.251 0 00:25:58.251 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.251 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:58.510 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:58.768 /dev/nbd0 00:25:58.768 10:52:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.768 1+0 records in 00:25:58.768 1+0 records out 00:25:58.768 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292208 s, 14.0 MB/s 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@12 -- # local i 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:58.768 10:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:59.026 /dev/nbd1 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:59.026 1+0 records in 00:25:59.026 1+0 records out 00:25:59.026 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234839 s, 17.4 MB/s 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:59.026 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.026 
10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:59.285 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:59.543 /dev/nbd1 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:59.543 10:52:34 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:59.543 1+0 records in 00:25:59.543 1+0 records out 00:25:59.543 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248587 s, 16.5 MB/s 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.543 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local 
i 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.801 10:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:00.058 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:00.316 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:00.575 [2024-07-12 10:52:35.714683] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:00.575 [2024-07-12 10:52:35.714727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:00.575 [2024-07-12 10:52:35.714748] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb11250 00:26:00.575 [2024-07-12 10:52:35.714760] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:00.575 [2024-07-12 10:52:35.716372] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:00.575 [2024-07-12 10:52:35.716400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:00.575 [2024-07-12 10:52:35.716478] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:00.575 [2024-07-12 10:52:35.716519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:00.575 [2024-07-12 10:52:35.716625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:00.575 [2024-07-12 10:52:35.716700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:00.575 spare 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.575 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.834 [2024-07-12 10:52:35.817023] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb11cb0 00:26:00.834 [2024-07-12 10:52:35.817040] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:00.834 [2024-07-12 10:52:35.817236] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb114e0 00:26:00.834 [2024-07-12 10:52:35.817386] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb11cb0 00:26:00.834 [2024-07-12 10:52:35.817397] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb11cb0 00:26:00.834 [2024-07-12 10:52:35.817517] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:00.834 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:00.834 "name": "raid_bdev1", 00:26:00.834 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:00.834 "strip_size_kb": 0, 00:26:00.834 "state": "online", 00:26:00.834 "raid_level": "raid1", 00:26:00.834 "superblock": true, 00:26:00.834 "num_base_bdevs": 4, 00:26:00.834 "num_base_bdevs_discovered": 3, 00:26:00.834 "num_base_bdevs_operational": 3, 00:26:00.834 "base_bdevs_list": [ 00:26:00.834 { 00:26:00.834 "name": "spare", 00:26:00.834 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:26:00.834 "is_configured": true, 00:26:00.834 "data_offset": 2048, 00:26:00.834 "data_size": 63488 00:26:00.834 }, 00:26:00.834 { 00:26:00.834 "name": null, 00:26:00.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.834 "is_configured": false, 00:26:00.834 "data_offset": 2048, 00:26:00.834 "data_size": 63488 00:26:00.834 }, 00:26:00.834 { 00:26:00.834 "name": "BaseBdev3", 00:26:00.834 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:00.834 "is_configured": true, 00:26:00.834 "data_offset": 2048, 00:26:00.834 "data_size": 63488 00:26:00.834 }, 00:26:00.834 { 00:26:00.834 "name": "BaseBdev4", 00:26:00.834 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:00.834 "is_configured": true, 00:26:00.834 "data_offset": 2048, 00:26:00.834 "data_size": 63488 00:26:00.834 } 00:26:00.834 ] 00:26:00.834 }' 00:26:00.834 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:00.834 10:52:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:01.399 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:01.399 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.399 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:26:01.399 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:01.399 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:01.658 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.658 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.658 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:01.658 "name": "raid_bdev1", 00:26:01.658 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:01.658 "strip_size_kb": 0, 00:26:01.658 "state": "online", 00:26:01.658 "raid_level": "raid1", 00:26:01.658 "superblock": true, 00:26:01.658 "num_base_bdevs": 4, 00:26:01.658 "num_base_bdevs_discovered": 3, 00:26:01.658 "num_base_bdevs_operational": 3, 00:26:01.658 "base_bdevs_list": [ 00:26:01.658 { 00:26:01.658 "name": "spare", 00:26:01.658 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:26:01.658 "is_configured": true, 00:26:01.658 "data_offset": 2048, 00:26:01.658 "data_size": 63488 00:26:01.658 }, 00:26:01.658 { 00:26:01.658 "name": null, 00:26:01.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.658 "is_configured": false, 00:26:01.658 "data_offset": 2048, 00:26:01.658 "data_size": 63488 00:26:01.658 }, 00:26:01.658 { 00:26:01.658 "name": "BaseBdev3", 00:26:01.658 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:01.658 "is_configured": true, 00:26:01.658 "data_offset": 2048, 00:26:01.658 "data_size": 63488 00:26:01.658 }, 00:26:01.658 { 00:26:01.658 "name": "BaseBdev4", 00:26:01.658 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:01.658 "is_configured": true, 00:26:01.658 "data_offset": 2048, 00:26:01.658 "data_size": 63488 00:26:01.658 } 00:26:01.658 ] 00:26:01.658 }' 00:26:01.658 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:01.658 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:01.658 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:01.658 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:01.916 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.916 10:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:01.916 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:01.917 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:02.175 [2024-07-12 10:52:37.319266] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 
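For reference, the verify_raid_bdev_state call being traced at this point reduces to a single bdev_raid_get_bdevs query plus a few jq field checks against the expected values (state online, raid1, two operational base bdevs after the spare was removed). A minimal standalone sketch of the same check follows; it assumes the SPDK target is still serving RPCs on /var/tmp/spdk-raid.sock, and $rpc / $sock are shorthand introduced here for readability, not variables from the harness itself:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # fetch the descriptor for raid_bdev1 only
  info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # after removing the spare, the array must stay online at raid1 with two operational members
  [[ $(jq -r '.state' <<< "$info") == "online" ]]
  [[ $(jq -r '.raid_level' <<< "$info") == "raid1" ]]
  [[ $(jq -r '.num_base_bdevs_operational' <<< "$info") -eq 2 ]]
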
00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.175 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.433 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:02.433 "name": "raid_bdev1", 00:26:02.433 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:02.433 "strip_size_kb": 0, 00:26:02.433 "state": "online", 00:26:02.433 "raid_level": "raid1", 00:26:02.433 "superblock": true, 00:26:02.433 "num_base_bdevs": 4, 00:26:02.433 "num_base_bdevs_discovered": 2, 00:26:02.433 "num_base_bdevs_operational": 2, 00:26:02.433 "base_bdevs_list": [ 00:26:02.433 { 00:26:02.433 "name": null, 00:26:02.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.433 "is_configured": false, 00:26:02.433 "data_offset": 2048, 00:26:02.433 "data_size": 63488 00:26:02.433 }, 00:26:02.433 { 00:26:02.433 "name": null, 00:26:02.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.433 "is_configured": false, 00:26:02.433 "data_offset": 2048, 00:26:02.433 "data_size": 63488 00:26:02.433 }, 00:26:02.433 { 00:26:02.433 "name": "BaseBdev3", 00:26:02.433 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:02.433 "is_configured": true, 00:26:02.433 "data_offset": 2048, 00:26:02.433 "data_size": 63488 00:26:02.433 }, 00:26:02.433 { 00:26:02.433 "name": "BaseBdev4", 00:26:02.433 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:02.433 "is_configured": true, 00:26:02.433 "data_offset": 2048, 00:26:02.433 "data_size": 63488 00:26:02.433 } 00:26:02.433 ] 00:26:02.433 }' 00:26:02.433 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:02.433 10:52:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:02.998 10:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:03.256 [2024-07-12 10:52:38.414339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:03.256 [2024-07-12 10:52:38.414495] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:03.256 [2024-07-12 10:52:38.414512] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
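Once the spare is re-added, raid_bdev starts a rebuild process (the "Started rebuild" notice below), and the verify_raid_bdev_process step that follows checks the process.type and process.target fields of the same bdev_raid_get_bdevs output. A rough equivalent is sketched here using the jq filters visible in the trace; the progress-polling loop is purely illustrative and not part of the harness, and the field names come from the JSON shown in this log:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  info() { "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'; }
  # the rebuild must be running and must target the re-added spare
  [[ $(info | jq -r '.process.type // "none"') == "rebuild" ]]
  [[ $(info | jq -r '.process.target // "none"') == "spare" ]]
  # optionally watch progress; the process object disappears once the rebuild finishes
  while pct=$(info | jq -r '.process.progress.percent // empty'); [[ -n "$pct" ]]; do
      echo "rebuild at ${pct}%"
      sleep 1
  done
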
00:26:03.256 [2024-07-12 10:52:38.414539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:03.256 [2024-07-12 10:52:38.418894] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ded50 00:26:03.256 [2024-07-12 10:52:38.421041] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:03.257 10:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.631 "name": "raid_bdev1", 00:26:04.631 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:04.631 "strip_size_kb": 0, 00:26:04.631 "state": "online", 00:26:04.631 "raid_level": "raid1", 00:26:04.631 "superblock": true, 00:26:04.631 "num_base_bdevs": 4, 00:26:04.631 "num_base_bdevs_discovered": 3, 00:26:04.631 "num_base_bdevs_operational": 3, 00:26:04.631 "process": { 00:26:04.631 "type": "rebuild", 00:26:04.631 "target": "spare", 00:26:04.631 "progress": { 00:26:04.631 "blocks": 22528, 00:26:04.631 "percent": 35 00:26:04.631 } 00:26:04.631 }, 00:26:04.631 "base_bdevs_list": [ 00:26:04.631 { 00:26:04.631 "name": "spare", 00:26:04.631 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:26:04.631 "is_configured": true, 00:26:04.631 "data_offset": 2048, 00:26:04.631 "data_size": 63488 00:26:04.631 }, 00:26:04.631 { 00:26:04.631 "name": null, 00:26:04.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.631 "is_configured": false, 00:26:04.631 "data_offset": 2048, 00:26:04.631 "data_size": 63488 00:26:04.631 }, 00:26:04.631 { 00:26:04.631 "name": "BaseBdev3", 00:26:04.631 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:04.631 "is_configured": true, 00:26:04.631 "data_offset": 2048, 00:26:04.631 "data_size": 63488 00:26:04.631 }, 00:26:04.631 { 00:26:04.631 "name": "BaseBdev4", 00:26:04.631 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:04.631 "is_configured": true, 00:26:04.631 "data_offset": 2048, 00:26:04.631 "data_size": 63488 00:26:04.631 } 00:26:04.631 ] 00:26:04.631 }' 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:04.631 10:52:39 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:04.889 [2024-07-12 10:52:39.949030] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:04.889 [2024-07-12 10:52:40.033234] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:04.889 [2024-07-12 10:52:40.033278] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:04.889 [2024-07-12 10:52:40.033294] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:04.889 [2024-07-12 10:52:40.033302] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.889 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.152 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:05.152 "name": "raid_bdev1", 00:26:05.152 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:05.152 "strip_size_kb": 0, 00:26:05.152 "state": "online", 00:26:05.152 "raid_level": "raid1", 00:26:05.152 "superblock": true, 00:26:05.152 "num_base_bdevs": 4, 00:26:05.152 "num_base_bdevs_discovered": 2, 00:26:05.152 "num_base_bdevs_operational": 2, 00:26:05.152 "base_bdevs_list": [ 00:26:05.152 { 00:26:05.152 "name": null, 00:26:05.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.152 "is_configured": false, 00:26:05.152 "data_offset": 2048, 00:26:05.152 "data_size": 63488 00:26:05.152 }, 00:26:05.152 { 00:26:05.152 "name": null, 00:26:05.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.152 "is_configured": false, 00:26:05.152 "data_offset": 2048, 00:26:05.152 "data_size": 63488 00:26:05.152 }, 00:26:05.152 { 00:26:05.152 "name": "BaseBdev3", 00:26:05.152 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:05.152 "is_configured": true, 00:26:05.152 "data_offset": 2048, 00:26:05.152 "data_size": 63488 00:26:05.152 }, 00:26:05.152 { 00:26:05.152 "name": "BaseBdev4", 00:26:05.152 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:05.152 "is_configured": true, 
00:26:05.152 "data_offset": 2048, 00:26:05.152 "data_size": 63488 00:26:05.152 } 00:26:05.152 ] 00:26:05.152 }' 00:26:05.152 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:05.152 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:05.718 10:52:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:05.976 [2024-07-12 10:52:41.108486] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:05.976 [2024-07-12 10:52:41.108537] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.976 [2024-07-12 10:52:41.108565] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9deeb0 00:26:05.976 [2024-07-12 10:52:41.108578] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.976 [2024-07-12 10:52:41.108950] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.976 [2024-07-12 10:52:41.108968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:05.976 [2024-07-12 10:52:41.109049] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:05.976 [2024-07-12 10:52:41.109061] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:05.976 [2024-07-12 10:52:41.109071] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:05.976 [2024-07-12 10:52:41.109090] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:05.976 [2024-07-12 10:52:41.113503] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9df280 00:26:05.976 spare 00:26:05.976 [2024-07-12 10:52:41.114975] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:05.976 10:52:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.350 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.350 "name": "raid_bdev1", 00:26:07.350 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:07.350 "strip_size_kb": 0, 00:26:07.350 "state": "online", 00:26:07.350 "raid_level": "raid1", 00:26:07.350 "superblock": true, 00:26:07.350 "num_base_bdevs": 4, 00:26:07.350 "num_base_bdevs_discovered": 3, 00:26:07.350 
"num_base_bdevs_operational": 3, 00:26:07.350 "process": { 00:26:07.350 "type": "rebuild", 00:26:07.350 "target": "spare", 00:26:07.350 "progress": { 00:26:07.350 "blocks": 24576, 00:26:07.350 "percent": 38 00:26:07.350 } 00:26:07.350 }, 00:26:07.350 "base_bdevs_list": [ 00:26:07.350 { 00:26:07.350 "name": "spare", 00:26:07.350 "uuid": "5c7585ef-108c-513b-87aa-137beb36603e", 00:26:07.350 "is_configured": true, 00:26:07.350 "data_offset": 2048, 00:26:07.350 "data_size": 63488 00:26:07.350 }, 00:26:07.350 { 00:26:07.351 "name": null, 00:26:07.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.351 "is_configured": false, 00:26:07.351 "data_offset": 2048, 00:26:07.351 "data_size": 63488 00:26:07.351 }, 00:26:07.351 { 00:26:07.351 "name": "BaseBdev3", 00:26:07.351 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:07.351 "is_configured": true, 00:26:07.351 "data_offset": 2048, 00:26:07.351 "data_size": 63488 00:26:07.351 }, 00:26:07.351 { 00:26:07.351 "name": "BaseBdev4", 00:26:07.351 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:07.351 "is_configured": true, 00:26:07.351 "data_offset": 2048, 00:26:07.351 "data_size": 63488 00:26:07.351 } 00:26:07.351 ] 00:26:07.351 }' 00:26:07.351 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.351 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:07.351 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.351 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:07.351 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:07.609 [2024-07-12 10:52:42.698746] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.609 [2024-07-12 10:52:42.727682] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:07.609 [2024-07-12 10:52:42.727732] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.609 [2024-07-12 10:52:42.727748] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.609 [2024-07-12 10:52:42.727757] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.609 
10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.609 10:52:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.867 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.867 "name": "raid_bdev1", 00:26:07.867 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:07.867 "strip_size_kb": 0, 00:26:07.867 "state": "online", 00:26:07.867 "raid_level": "raid1", 00:26:07.867 "superblock": true, 00:26:07.867 "num_base_bdevs": 4, 00:26:07.867 "num_base_bdevs_discovered": 2, 00:26:07.867 "num_base_bdevs_operational": 2, 00:26:07.867 "base_bdevs_list": [ 00:26:07.867 { 00:26:07.867 "name": null, 00:26:07.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.867 "is_configured": false, 00:26:07.867 "data_offset": 2048, 00:26:07.867 "data_size": 63488 00:26:07.867 }, 00:26:07.867 { 00:26:07.867 "name": null, 00:26:07.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.867 "is_configured": false, 00:26:07.867 "data_offset": 2048, 00:26:07.867 "data_size": 63488 00:26:07.867 }, 00:26:07.867 { 00:26:07.867 "name": "BaseBdev3", 00:26:07.867 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:07.867 "is_configured": true, 00:26:07.867 "data_offset": 2048, 00:26:07.867 "data_size": 63488 00:26:07.867 }, 00:26:07.867 { 00:26:07.867 "name": "BaseBdev4", 00:26:07.867 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:07.867 "is_configured": true, 00:26:07.867 "data_offset": 2048, 00:26:07.867 "data_size": 63488 00:26:07.867 } 00:26:07.867 ] 00:26:07.867 }' 00:26:07.867 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.868 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:08.443 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:08.443 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:08.443 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:08.443 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:08.443 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.751 "name": "raid_bdev1", 00:26:08.751 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:08.751 "strip_size_kb": 0, 00:26:08.751 "state": "online", 00:26:08.751 "raid_level": "raid1", 00:26:08.751 "superblock": true, 00:26:08.751 "num_base_bdevs": 4, 00:26:08.751 "num_base_bdevs_discovered": 2, 00:26:08.751 "num_base_bdevs_operational": 2, 00:26:08.751 "base_bdevs_list": [ 00:26:08.751 { 00:26:08.751 "name": null, 00:26:08.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.751 
"is_configured": false, 00:26:08.751 "data_offset": 2048, 00:26:08.751 "data_size": 63488 00:26:08.751 }, 00:26:08.751 { 00:26:08.751 "name": null, 00:26:08.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.751 "is_configured": false, 00:26:08.751 "data_offset": 2048, 00:26:08.751 "data_size": 63488 00:26:08.751 }, 00:26:08.751 { 00:26:08.751 "name": "BaseBdev3", 00:26:08.751 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:08.751 "is_configured": true, 00:26:08.751 "data_offset": 2048, 00:26:08.751 "data_size": 63488 00:26:08.751 }, 00:26:08.751 { 00:26:08.751 "name": "BaseBdev4", 00:26:08.751 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:08.751 "is_configured": true, 00:26:08.751 "data_offset": 2048, 00:26:08.751 "data_size": 63488 00:26:08.751 } 00:26:08.751 ] 00:26:08.751 }' 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:08.751 10:52:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:09.009 10:52:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:09.357 [2024-07-12 10:52:44.348461] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:09.357 [2024-07-12 10:52:44.348567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:09.357 [2024-07-12 10:52:44.348592] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb169c0 00:26:09.357 [2024-07-12 10:52:44.348605] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:09.357 [2024-07-12 10:52:44.348952] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:09.357 [2024-07-12 10:52:44.348969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:09.357 [2024-07-12 10:52:44.349042] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:09.357 [2024-07-12 10:52:44.349054] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:09.357 [2024-07-12 10:52:44.349065] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:09.357 BaseBdev1 00:26:09.357 10:52:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.289 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.547 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:10.547 "name": "raid_bdev1", 00:26:10.547 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:10.547 "strip_size_kb": 0, 00:26:10.547 "state": "online", 00:26:10.547 "raid_level": "raid1", 00:26:10.547 "superblock": true, 00:26:10.547 "num_base_bdevs": 4, 00:26:10.547 "num_base_bdevs_discovered": 2, 00:26:10.547 "num_base_bdevs_operational": 2, 00:26:10.547 "base_bdevs_list": [ 00:26:10.547 { 00:26:10.547 "name": null, 00:26:10.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.547 "is_configured": false, 00:26:10.547 "data_offset": 2048, 00:26:10.547 "data_size": 63488 00:26:10.547 }, 00:26:10.547 { 00:26:10.547 "name": null, 00:26:10.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.547 "is_configured": false, 00:26:10.547 "data_offset": 2048, 00:26:10.547 "data_size": 63488 00:26:10.547 }, 00:26:10.547 { 00:26:10.547 "name": "BaseBdev3", 00:26:10.547 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:10.547 "is_configured": true, 00:26:10.547 "data_offset": 2048, 00:26:10.547 "data_size": 63488 00:26:10.547 }, 00:26:10.547 { 00:26:10.547 "name": "BaseBdev4", 00:26:10.547 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:10.547 "is_configured": true, 00:26:10.547 "data_offset": 2048, 00:26:10.547 "data_size": 63488 00:26:10.547 } 00:26:10.547 ] 00:26:10.547 }' 00:26:10.547 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:10.547 10:52:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:11.112 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:11.112 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.112 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:11.112 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:11.112 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.112 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.112 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:26:11.369 "name": "raid_bdev1", 00:26:11.369 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:11.369 "strip_size_kb": 0, 00:26:11.369 "state": "online", 00:26:11.369 "raid_level": "raid1", 00:26:11.369 "superblock": true, 00:26:11.369 "num_base_bdevs": 4, 00:26:11.369 "num_base_bdevs_discovered": 2, 00:26:11.369 "num_base_bdevs_operational": 2, 00:26:11.369 "base_bdevs_list": [ 00:26:11.369 { 00:26:11.369 "name": null, 00:26:11.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.369 "is_configured": false, 00:26:11.369 "data_offset": 2048, 00:26:11.369 "data_size": 63488 00:26:11.369 }, 00:26:11.369 { 00:26:11.369 "name": null, 00:26:11.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.369 "is_configured": false, 00:26:11.369 "data_offset": 2048, 00:26:11.369 "data_size": 63488 00:26:11.369 }, 00:26:11.369 { 00:26:11.369 "name": "BaseBdev3", 00:26:11.369 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:11.369 "is_configured": true, 00:26:11.369 "data_offset": 2048, 00:26:11.369 "data_size": 63488 00:26:11.369 }, 00:26:11.369 { 00:26:11.369 "name": "BaseBdev4", 00:26:11.369 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:11.369 "is_configured": true, 00:26:11.369 "data_offset": 2048, 00:26:11.369 "data_size": 63488 00:26:11.369 } 00:26:11.369 ] 00:26:11.369 }' 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.369 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:11.369 
10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:11.626 [2024-07-12 10:52:46.727086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:11.626 [2024-07-12 10:52:46.727211] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:11.626 [2024-07-12 10:52:46.727227] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:11.626 request: 00:26:11.626 { 00:26:11.626 "base_bdev": "BaseBdev1", 00:26:11.626 "raid_bdev": "raid_bdev1", 00:26:11.626 "method": "bdev_raid_add_base_bdev", 00:26:11.626 "req_id": 1 00:26:11.626 } 00:26:11.626 Got JSON-RPC error response 00:26:11.626 response: 00:26:11.626 { 00:26:11.626 "code": -22, 00:26:11.626 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:11.626 } 00:26:11.626 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:11.626 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:11.626 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:11.626 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:11.626 10:52:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:12.996 "name": "raid_bdev1", 00:26:12.996 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:12.996 "strip_size_kb": 0, 00:26:12.996 "state": "online", 00:26:12.996 "raid_level": "raid1", 00:26:12.996 "superblock": true, 00:26:12.996 "num_base_bdevs": 4, 00:26:12.996 "num_base_bdevs_discovered": 2, 00:26:12.996 "num_base_bdevs_operational": 2, 00:26:12.996 "base_bdevs_list": [ 
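The request/response pair above is a deliberate failure: BaseBdev1 carries an older superblock sequence number and its uuid is no longer recorded in the raid superblock, so bdev_raid_add_base_bdev has to come back with JSON-RPC error -22. A hedged sketch of that expected-failure check, with a plain if-block standing in for the harness's NOT wrapper:

    # Sketch of the negative test above; the if-block replaces the NOT helper.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    if "$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 BaseBdev1; then
        echo "stale BaseBdev1 was accepted back into raid_bdev1" >&2
        exit 1
    fi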
00:26:12.996 { 00:26:12.996 "name": null, 00:26:12.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.996 "is_configured": false, 00:26:12.996 "data_offset": 2048, 00:26:12.996 "data_size": 63488 00:26:12.996 }, 00:26:12.996 { 00:26:12.996 "name": null, 00:26:12.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.996 "is_configured": false, 00:26:12.996 "data_offset": 2048, 00:26:12.996 "data_size": 63488 00:26:12.996 }, 00:26:12.996 { 00:26:12.996 "name": "BaseBdev3", 00:26:12.996 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:12.996 "is_configured": true, 00:26:12.996 "data_offset": 2048, 00:26:12.996 "data_size": 63488 00:26:12.996 }, 00:26:12.996 { 00:26:12.996 "name": "BaseBdev4", 00:26:12.996 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:12.996 "is_configured": true, 00:26:12.996 "data_offset": 2048, 00:26:12.996 "data_size": 63488 00:26:12.996 } 00:26:12.996 ] 00:26:12.996 }' 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:12.996 10:52:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:13.561 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:13.561 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:13.561 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:13.561 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:13.561 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:13.561 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.561 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.829 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:13.829 "name": "raid_bdev1", 00:26:13.829 "uuid": "e576c561-d762-440e-976c-a64fb10e15c2", 00:26:13.829 "strip_size_kb": 0, 00:26:13.829 "state": "online", 00:26:13.829 "raid_level": "raid1", 00:26:13.829 "superblock": true, 00:26:13.829 "num_base_bdevs": 4, 00:26:13.829 "num_base_bdevs_discovered": 2, 00:26:13.829 "num_base_bdevs_operational": 2, 00:26:13.830 "base_bdevs_list": [ 00:26:13.830 { 00:26:13.830 "name": null, 00:26:13.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.830 "is_configured": false, 00:26:13.830 "data_offset": 2048, 00:26:13.830 "data_size": 63488 00:26:13.830 }, 00:26:13.830 { 00:26:13.830 "name": null, 00:26:13.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.830 "is_configured": false, 00:26:13.830 "data_offset": 2048, 00:26:13.830 "data_size": 63488 00:26:13.830 }, 00:26:13.830 { 00:26:13.830 "name": "BaseBdev3", 00:26:13.830 "uuid": "d30777c1-5a15-5819-b3aa-31759af5f50c", 00:26:13.830 "is_configured": true, 00:26:13.830 "data_offset": 2048, 00:26:13.830 "data_size": 63488 00:26:13.830 }, 00:26:13.830 { 00:26:13.830 "name": "BaseBdev4", 00:26:13.830 "uuid": "17184e4c-7225-55eb-8888-1ce880f67488", 00:26:13.830 "is_configured": true, 00:26:13.830 "data_offset": 2048, 00:26:13.830 "data_size": 63488 00:26:13.830 } 00:26:13.830 ] 00:26:13.830 }' 00:26:13.830 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:13.830 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:13.830 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:13.830 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:13.830 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2148488 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2148488 ']' 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2148488 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2148488 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2148488' 00:26:13.831 killing process with pid 2148488 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2148488 00:26:13.831 Received shutdown signal, test time was about 28.291735 seconds 00:26:13.831 00:26:13.831 Latency(us) 00:26:13.831 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:13.831 =================================================================================================================== 00:26:13.831 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:13.831 [2024-07-12 10:52:48.923450] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:13.831 [2024-07-12 10:52:48.923560] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:13.831 [2024-07-12 10:52:48.923628] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:13.831 [2024-07-12 10:52:48.923640] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb11cb0 name raid_bdev1, state offline 00:26:13.831 10:52:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2148488 00:26:13.831 [2024-07-12 10:52:48.964954] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:14.096 10:52:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:14.096 00:26:14.096 real 0m34.038s 00:26:14.096 user 0m53.084s 00:26:14.096 sys 0m5.333s 00:26:14.096 10:52:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:14.096 10:52:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:14.096 ************************************ 00:26:14.096 END TEST raid_rebuild_test_sb_io 00:26:14.096 ************************************ 00:26:14.096 10:52:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:14.096 10:52:49 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:26:14.096 10:52:49 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:26:14.096 10:52:49 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test 
raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:26:14.096 10:52:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:14.096 10:52:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:14.096 10:52:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:14.096 ************************************ 00:26:14.096 START TEST raid_state_function_test_sb_4k 00:26:14.096 ************************************ 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2153344 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2153344' 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:14.096 Process raid pid: 2153344 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2153344 /var/tmp/spdk-raid.sock 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2153344 ']' 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:14.096 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:14.096 10:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:14.354 [2024-07-12 10:52:49.326329] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:26:14.354 [2024-07-12 10:52:49.326396] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:14.354 [2024-07-12 10:52:49.452919] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:14.612 [2024-07-12 10:52:49.556845] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:14.612 [2024-07-12 10:52:49.622109] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:14.612 [2024-07-12 10:52:49.622144] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:15.178 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:15.178 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:15.178 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:15.437 [2024-07-12 10:52:50.502602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:15.437 [2024-07-12 10:52:50.502645] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:15.437 [2024-07-12 10:52:50.502656] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:15.437 [2024-07-12 10:52:50.502668] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k 
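Before any raid RPCs are issued, the trace above starts bdev_svc with a dedicated RPC socket and bdev_raid debug logging, then blocks until the socket answers. A rough equivalent of that startup sequence; the polling loop and the rpc_get_methods probe are illustrative stand-ins for the waitforlisten helper, which the log does not show in full:

    # Illustrative startup sketch; waitforlisten in the harness does more checks.
    svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    "$svc" -r "$sock" -i 0 -L bdev_raid &
    raid_pid=$!

    # Poll the socket until the app responds to a basic RPC.
    until "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.1
    done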
-- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.437 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:15.696 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:15.696 "name": "Existed_Raid", 00:26:15.696 "uuid": "3d4959c6-bab2-4717-a886-645ad3c7e438", 00:26:15.696 "strip_size_kb": 0, 00:26:15.696 "state": "configuring", 00:26:15.696 "raid_level": "raid1", 00:26:15.696 "superblock": true, 00:26:15.696 "num_base_bdevs": 2, 00:26:15.696 "num_base_bdevs_discovered": 0, 00:26:15.696 "num_base_bdevs_operational": 2, 00:26:15.696 "base_bdevs_list": [ 00:26:15.696 { 00:26:15.696 "name": "BaseBdev1", 00:26:15.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.696 "is_configured": false, 00:26:15.696 "data_offset": 0, 00:26:15.696 "data_size": 0 00:26:15.696 }, 00:26:15.696 { 00:26:15.696 "name": "BaseBdev2", 00:26:15.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.696 "is_configured": false, 00:26:15.696 "data_offset": 0, 00:26:15.696 "data_size": 0 00:26:15.696 } 00:26:15.696 ] 00:26:15.696 }' 00:26:15.696 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:15.696 10:52:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:16.263 10:52:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:16.521 [2024-07-12 10:52:51.609395] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:16.521 [2024-07-12 10:52:51.609428] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x249ba80 name Existed_Raid, state configuring 00:26:16.521 10:52:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:16.779 [2024-07-12 10:52:51.854055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:16.779 [2024-07-12 10:52:51.854090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:16.779 [2024-07-12 10:52:51.854100] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:16.779 [2024-07-12 10:52:51.854111] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:16.779 10:52:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # 
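The first state check relies on creating the raid volume before its base bdevs exist: bdev_raid_create records both names, logs that they "don't exist now", and leaves Existed_Raid in the configuring state with nothing discovered. A minimal sketch of that step and its assertion, using the same RPCs shown in the trace:

    # Sketch of the create-before-bases step traced above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    "$rpc" -s "$sock" bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(echo "$info" | jq -r .state) == configuring ]]
    [[ $(echo "$info" | jq -r .num_base_bdevs_discovered) -eq 0 ]]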
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:26:17.037 [2024-07-12 10:52:52.105743] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:17.037 BaseBdev1 00:26:17.037 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:17.037 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:17.037 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:17.037 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:17.037 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:17.037 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:17.037 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:17.296 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:17.554 [ 00:26:17.554 { 00:26:17.554 "name": "BaseBdev1", 00:26:17.554 "aliases": [ 00:26:17.554 "f0bf0c99-e926-4853-aaf9-253fdad3d58b" 00:26:17.554 ], 00:26:17.554 "product_name": "Malloc disk", 00:26:17.554 "block_size": 4096, 00:26:17.554 "num_blocks": 8192, 00:26:17.554 "uuid": "f0bf0c99-e926-4853-aaf9-253fdad3d58b", 00:26:17.554 "assigned_rate_limits": { 00:26:17.554 "rw_ios_per_sec": 0, 00:26:17.554 "rw_mbytes_per_sec": 0, 00:26:17.554 "r_mbytes_per_sec": 0, 00:26:17.554 "w_mbytes_per_sec": 0 00:26:17.554 }, 00:26:17.554 "claimed": true, 00:26:17.554 "claim_type": "exclusive_write", 00:26:17.554 "zoned": false, 00:26:17.554 "supported_io_types": { 00:26:17.554 "read": true, 00:26:17.554 "write": true, 00:26:17.554 "unmap": true, 00:26:17.554 "flush": true, 00:26:17.554 "reset": true, 00:26:17.554 "nvme_admin": false, 00:26:17.554 "nvme_io": false, 00:26:17.554 "nvme_io_md": false, 00:26:17.554 "write_zeroes": true, 00:26:17.554 "zcopy": true, 00:26:17.554 "get_zone_info": false, 00:26:17.554 "zone_management": false, 00:26:17.554 "zone_append": false, 00:26:17.554 "compare": false, 00:26:17.554 "compare_and_write": false, 00:26:17.554 "abort": true, 00:26:17.554 "seek_hole": false, 00:26:17.554 "seek_data": false, 00:26:17.554 "copy": true, 00:26:17.554 "nvme_iov_md": false 00:26:17.554 }, 00:26:17.554 "memory_domains": [ 00:26:17.554 { 00:26:17.554 "dma_device_id": "system", 00:26:17.554 "dma_device_type": 1 00:26:17.554 }, 00:26:17.554 { 00:26:17.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:17.554 "dma_device_type": 2 00:26:17.554 } 00:26:17.554 ], 00:26:17.554 "driver_specific": {} 00:26:17.554 } 00:26:17.554 ] 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.554 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:17.813 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.813 "name": "Existed_Raid", 00:26:17.813 "uuid": "9801ae28-d117-4332-96df-e6b04778dfaf", 00:26:17.813 "strip_size_kb": 0, 00:26:17.813 "state": "configuring", 00:26:17.813 "raid_level": "raid1", 00:26:17.813 "superblock": true, 00:26:17.813 "num_base_bdevs": 2, 00:26:17.813 "num_base_bdevs_discovered": 1, 00:26:17.813 "num_base_bdevs_operational": 2, 00:26:17.813 "base_bdevs_list": [ 00:26:17.813 { 00:26:17.813 "name": "BaseBdev1", 00:26:17.813 "uuid": "f0bf0c99-e926-4853-aaf9-253fdad3d58b", 00:26:17.813 "is_configured": true, 00:26:17.813 "data_offset": 256, 00:26:17.813 "data_size": 7936 00:26:17.813 }, 00:26:17.813 { 00:26:17.813 "name": "BaseBdev2", 00:26:17.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.813 "is_configured": false, 00:26:17.813 "data_offset": 0, 00:26:17.813 "data_size": 0 00:26:17.813 } 00:26:17.813 ] 00:26:17.813 }' 00:26:17.813 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.813 10:52:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:18.377 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:18.634 [2024-07-12 10:52:53.689959] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:18.634 [2024-07-12 10:52:53.690002] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x249b350 name Existed_Raid, state configuring 00:26:18.634 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:18.892 [2024-07-12 10:52:53.930632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:18.892 [2024-07-12 10:52:53.932145] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:18.892 [2024-07-12 10:52:53.932177] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:18.892 
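BaseBdev1 above is an ordinary malloc bdev sized 32 MiB with a 4096-byte block size (8192 blocks), which is what gives this test its _4k suffix; waitforbdev then confirms the bdev is visible before the raid state is re-checked. A sketch of that creation step, built from the RPCs visible in the trace:

    # Sketch of the 4k-block base bdev creation traced above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    "$rpc" -s "$sock" bdev_malloc_create 32 4096 -b BaseBdev1   # 32 MiB, 4096 B blocks
    "$rpc" -s "$sock" bdev_wait_for_examine
    "$rpc" -s "$sock" bdev_get_bdevs -b BaseBdev1 -t 2000 >/dev/null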
10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.892 10:52:53 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:19.151 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:19.151 "name": "Existed_Raid", 00:26:19.151 "uuid": "2db05f78-8253-46f2-8c5c-ab92323304e1", 00:26:19.151 "strip_size_kb": 0, 00:26:19.151 "state": "configuring", 00:26:19.151 "raid_level": "raid1", 00:26:19.151 "superblock": true, 00:26:19.151 "num_base_bdevs": 2, 00:26:19.151 "num_base_bdevs_discovered": 1, 00:26:19.151 "num_base_bdevs_operational": 2, 00:26:19.151 "base_bdevs_list": [ 00:26:19.151 { 00:26:19.151 "name": "BaseBdev1", 00:26:19.151 "uuid": "f0bf0c99-e926-4853-aaf9-253fdad3d58b", 00:26:19.151 "is_configured": true, 00:26:19.151 "data_offset": 256, 00:26:19.151 "data_size": 7936 00:26:19.151 }, 00:26:19.151 { 00:26:19.151 "name": "BaseBdev2", 00:26:19.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.151 "is_configured": false, 00:26:19.151 "data_offset": 0, 00:26:19.151 "data_size": 0 00:26:19.151 } 00:26:19.151 ] 00:26:19.151 }' 00:26:19.151 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:19.151 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:19.718 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:26:19.977 [2024-07-12 10:52:54.920751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:19.977 [2024-07-12 10:52:54.920900] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x249c000 00:26:19.977 [2024-07-12 10:52:54.920915] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, 
blocklen 4096 00:26:19.977 [2024-07-12 10:52:54.921090] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23b60c0 00:26:19.977 [2024-07-12 10:52:54.921209] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x249c000 00:26:19.977 [2024-07-12 10:52:54.921219] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x249c000 00:26:19.977 [2024-07-12 10:52:54.921308] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:19.977 BaseBdev2 00:26:19.977 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:19.977 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:19.977 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:19.977 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:19.977 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:19.977 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:19.977 10:52:54 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:19.977 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:20.235 [ 00:26:20.235 { 00:26:20.235 "name": "BaseBdev2", 00:26:20.235 "aliases": [ 00:26:20.235 "28af12b1-500e-4e72-9e16-7bbbb7d72ccd" 00:26:20.235 ], 00:26:20.235 "product_name": "Malloc disk", 00:26:20.235 "block_size": 4096, 00:26:20.235 "num_blocks": 8192, 00:26:20.235 "uuid": "28af12b1-500e-4e72-9e16-7bbbb7d72ccd", 00:26:20.235 "assigned_rate_limits": { 00:26:20.235 "rw_ios_per_sec": 0, 00:26:20.235 "rw_mbytes_per_sec": 0, 00:26:20.235 "r_mbytes_per_sec": 0, 00:26:20.235 "w_mbytes_per_sec": 0 00:26:20.235 }, 00:26:20.235 "claimed": true, 00:26:20.235 "claim_type": "exclusive_write", 00:26:20.235 "zoned": false, 00:26:20.235 "supported_io_types": { 00:26:20.235 "read": true, 00:26:20.235 "write": true, 00:26:20.235 "unmap": true, 00:26:20.235 "flush": true, 00:26:20.235 "reset": true, 00:26:20.235 "nvme_admin": false, 00:26:20.235 "nvme_io": false, 00:26:20.235 "nvme_io_md": false, 00:26:20.235 "write_zeroes": true, 00:26:20.235 "zcopy": true, 00:26:20.235 "get_zone_info": false, 00:26:20.235 "zone_management": false, 00:26:20.235 "zone_append": false, 00:26:20.235 "compare": false, 00:26:20.235 "compare_and_write": false, 00:26:20.235 "abort": true, 00:26:20.235 "seek_hole": false, 00:26:20.235 "seek_data": false, 00:26:20.235 "copy": true, 00:26:20.235 "nvme_iov_md": false 00:26:20.235 }, 00:26:20.235 "memory_domains": [ 00:26:20.235 { 00:26:20.235 "dma_device_id": "system", 00:26:20.235 "dma_device_type": 1 00:26:20.235 }, 00:26:20.235 { 00:26:20.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:20.235 "dma_device_type": 2 00:26:20.235 } 00:26:20.235 ], 00:26:20.235 "driver_specific": {} 00:26:20.235 } 00:26:20.235 ] 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- 
# (( i++ )) 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.235 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:20.494 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:20.494 "name": "Existed_Raid", 00:26:20.494 "uuid": "2db05f78-8253-46f2-8c5c-ab92323304e1", 00:26:20.494 "strip_size_kb": 0, 00:26:20.494 "state": "online", 00:26:20.494 "raid_level": "raid1", 00:26:20.494 "superblock": true, 00:26:20.494 "num_base_bdevs": 2, 00:26:20.494 "num_base_bdevs_discovered": 2, 00:26:20.494 "num_base_bdevs_operational": 2, 00:26:20.494 "base_bdevs_list": [ 00:26:20.494 { 00:26:20.494 "name": "BaseBdev1", 00:26:20.494 "uuid": "f0bf0c99-e926-4853-aaf9-253fdad3d58b", 00:26:20.494 "is_configured": true, 00:26:20.494 "data_offset": 256, 00:26:20.494 "data_size": 7936 00:26:20.494 }, 00:26:20.494 { 00:26:20.494 "name": "BaseBdev2", 00:26:20.494 "uuid": "28af12b1-500e-4e72-9e16-7bbbb7d72ccd", 00:26:20.494 "is_configured": true, 00:26:20.494 "data_offset": 256, 00:26:20.494 "data_size": 7936 00:26:20.494 } 00:26:20.494 ] 00:26:20.494 }' 00:26:20.494 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:20.494 10:52:55 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- 
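With BaseBdev2 claimed as well, the trace above shows the raid finishing configuration: the io device is registered and Existed_Raid moves from configuring to online with both base bdevs discovered. A short assertion for that transition, in the same jq style the trace uses:

    # Sketch: confirm the raid came online once both base bdevs were claimed.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(echo "$info" | jq -r .state) == online ]]
    [[ $(echo "$info" | jq -r .num_base_bdevs_discovered) -eq 2 ]]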
bdev/bdev_raid.sh@198 -- # local name 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:21.060 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:21.318 [2024-07-12 10:52:56.320744] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:21.318 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:21.318 "name": "Existed_Raid", 00:26:21.318 "aliases": [ 00:26:21.318 "2db05f78-8253-46f2-8c5c-ab92323304e1" 00:26:21.318 ], 00:26:21.318 "product_name": "Raid Volume", 00:26:21.318 "block_size": 4096, 00:26:21.318 "num_blocks": 7936, 00:26:21.318 "uuid": "2db05f78-8253-46f2-8c5c-ab92323304e1", 00:26:21.318 "assigned_rate_limits": { 00:26:21.318 "rw_ios_per_sec": 0, 00:26:21.318 "rw_mbytes_per_sec": 0, 00:26:21.318 "r_mbytes_per_sec": 0, 00:26:21.318 "w_mbytes_per_sec": 0 00:26:21.318 }, 00:26:21.318 "claimed": false, 00:26:21.318 "zoned": false, 00:26:21.318 "supported_io_types": { 00:26:21.318 "read": true, 00:26:21.318 "write": true, 00:26:21.318 "unmap": false, 00:26:21.318 "flush": false, 00:26:21.318 "reset": true, 00:26:21.318 "nvme_admin": false, 00:26:21.318 "nvme_io": false, 00:26:21.318 "nvme_io_md": false, 00:26:21.318 "write_zeroes": true, 00:26:21.318 "zcopy": false, 00:26:21.318 "get_zone_info": false, 00:26:21.318 "zone_management": false, 00:26:21.318 "zone_append": false, 00:26:21.318 "compare": false, 00:26:21.318 "compare_and_write": false, 00:26:21.318 "abort": false, 00:26:21.318 "seek_hole": false, 00:26:21.318 "seek_data": false, 00:26:21.318 "copy": false, 00:26:21.318 "nvme_iov_md": false 00:26:21.318 }, 00:26:21.318 "memory_domains": [ 00:26:21.318 { 00:26:21.318 "dma_device_id": "system", 00:26:21.318 "dma_device_type": 1 00:26:21.318 }, 00:26:21.318 { 00:26:21.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:21.318 "dma_device_type": 2 00:26:21.318 }, 00:26:21.318 { 00:26:21.318 "dma_device_id": "system", 00:26:21.318 "dma_device_type": 1 00:26:21.318 }, 00:26:21.318 { 00:26:21.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:21.318 "dma_device_type": 2 00:26:21.318 } 00:26:21.318 ], 00:26:21.318 "driver_specific": { 00:26:21.318 "raid": { 00:26:21.318 "uuid": "2db05f78-8253-46f2-8c5c-ab92323304e1", 00:26:21.318 "strip_size_kb": 0, 00:26:21.318 "state": "online", 00:26:21.318 "raid_level": "raid1", 00:26:21.318 "superblock": true, 00:26:21.318 "num_base_bdevs": 2, 00:26:21.318 "num_base_bdevs_discovered": 2, 00:26:21.318 "num_base_bdevs_operational": 2, 00:26:21.318 "base_bdevs_list": [ 00:26:21.318 { 00:26:21.318 "name": "BaseBdev1", 00:26:21.318 "uuid": "f0bf0c99-e926-4853-aaf9-253fdad3d58b", 00:26:21.318 "is_configured": true, 00:26:21.318 "data_offset": 256, 00:26:21.318 "data_size": 7936 00:26:21.318 }, 00:26:21.318 { 00:26:21.318 "name": "BaseBdev2", 00:26:21.318 "uuid": "28af12b1-500e-4e72-9e16-7bbbb7d72ccd", 00:26:21.318 "is_configured": true, 00:26:21.318 "data_offset": 256, 00:26:21.318 "data_size": 7936 00:26:21.318 } 00:26:21.318 ] 00:26:21.318 } 00:26:21.318 } 00:26:21.318 }' 00:26:21.318 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:21.318 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='BaseBdev1 00:26:21.318 BaseBdev2' 00:26:21.318 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:21.318 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:21.318 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:21.576 "name": "BaseBdev1", 00:26:21.576 "aliases": [ 00:26:21.576 "f0bf0c99-e926-4853-aaf9-253fdad3d58b" 00:26:21.576 ], 00:26:21.576 "product_name": "Malloc disk", 00:26:21.576 "block_size": 4096, 00:26:21.576 "num_blocks": 8192, 00:26:21.576 "uuid": "f0bf0c99-e926-4853-aaf9-253fdad3d58b", 00:26:21.576 "assigned_rate_limits": { 00:26:21.576 "rw_ios_per_sec": 0, 00:26:21.576 "rw_mbytes_per_sec": 0, 00:26:21.576 "r_mbytes_per_sec": 0, 00:26:21.576 "w_mbytes_per_sec": 0 00:26:21.576 }, 00:26:21.576 "claimed": true, 00:26:21.576 "claim_type": "exclusive_write", 00:26:21.576 "zoned": false, 00:26:21.576 "supported_io_types": { 00:26:21.576 "read": true, 00:26:21.576 "write": true, 00:26:21.576 "unmap": true, 00:26:21.576 "flush": true, 00:26:21.576 "reset": true, 00:26:21.576 "nvme_admin": false, 00:26:21.576 "nvme_io": false, 00:26:21.576 "nvme_io_md": false, 00:26:21.576 "write_zeroes": true, 00:26:21.576 "zcopy": true, 00:26:21.576 "get_zone_info": false, 00:26:21.576 "zone_management": false, 00:26:21.576 "zone_append": false, 00:26:21.576 "compare": false, 00:26:21.576 "compare_and_write": false, 00:26:21.576 "abort": true, 00:26:21.576 "seek_hole": false, 00:26:21.576 "seek_data": false, 00:26:21.576 "copy": true, 00:26:21.576 "nvme_iov_md": false 00:26:21.576 }, 00:26:21.576 "memory_domains": [ 00:26:21.576 { 00:26:21.576 "dma_device_id": "system", 00:26:21.576 "dma_device_type": 1 00:26:21.576 }, 00:26:21.576 { 00:26:21.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:21.576 "dma_device_type": 2 00:26:21.576 } 00:26:21.576 ], 00:26:21.576 "driver_specific": {} 00:26:21.576 }' 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:21.576 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:21.834 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:21.834 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:21.834 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:21.834 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:21.834 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:21.834 10:52:56 
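The property checks running here compare the assembled raid volume against each configured base bdev: block_size must match (4096 in this run) and md_size, md_interleave and dif_type must agree as well. A sketch of that comparison loop, reusing the jq filters shown in the trace:

    # Sketch of the per-base-bdev property comparison traced above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    raid_info=$("$rpc" -s "$sock" bdev_get_bdevs -b Existed_Raid | jq '.[]')
    names=$(echo "$raid_info" |
        jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

    for name in $names; do
        base_info=$("$rpc" -s "$sock" bdev_get_bdevs -b "$name" | jq '.[]')
        for field in .block_size .md_size .md_interleave .dif_type; do
            [[ $(echo "$raid_info" | jq -r "$field") == $(echo "$base_info" | jq -r "$field") ]]
        done
    done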
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:21.834 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:21.834 10:52:56 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:22.093 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:22.093 "name": "BaseBdev2", 00:26:22.093 "aliases": [ 00:26:22.093 "28af12b1-500e-4e72-9e16-7bbbb7d72ccd" 00:26:22.093 ], 00:26:22.093 "product_name": "Malloc disk", 00:26:22.093 "block_size": 4096, 00:26:22.093 "num_blocks": 8192, 00:26:22.093 "uuid": "28af12b1-500e-4e72-9e16-7bbbb7d72ccd", 00:26:22.093 "assigned_rate_limits": { 00:26:22.093 "rw_ios_per_sec": 0, 00:26:22.093 "rw_mbytes_per_sec": 0, 00:26:22.093 "r_mbytes_per_sec": 0, 00:26:22.093 "w_mbytes_per_sec": 0 00:26:22.093 }, 00:26:22.093 "claimed": true, 00:26:22.093 "claim_type": "exclusive_write", 00:26:22.093 "zoned": false, 00:26:22.093 "supported_io_types": { 00:26:22.093 "read": true, 00:26:22.093 "write": true, 00:26:22.093 "unmap": true, 00:26:22.093 "flush": true, 00:26:22.093 "reset": true, 00:26:22.093 "nvme_admin": false, 00:26:22.093 "nvme_io": false, 00:26:22.093 "nvme_io_md": false, 00:26:22.093 "write_zeroes": true, 00:26:22.093 "zcopy": true, 00:26:22.093 "get_zone_info": false, 00:26:22.093 "zone_management": false, 00:26:22.093 "zone_append": false, 00:26:22.093 "compare": false, 00:26:22.093 "compare_and_write": false, 00:26:22.093 "abort": true, 00:26:22.093 "seek_hole": false, 00:26:22.093 "seek_data": false, 00:26:22.093 "copy": true, 00:26:22.093 "nvme_iov_md": false 00:26:22.093 }, 00:26:22.093 "memory_domains": [ 00:26:22.093 { 00:26:22.093 "dma_device_id": "system", 00:26:22.093 "dma_device_type": 1 00:26:22.093 }, 00:26:22.093 { 00:26:22.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:22.093 "dma_device_type": 2 00:26:22.093 } 00:26:22.093 ], 00:26:22.093 "driver_specific": {} 00:26:22.093 }' 00:26:22.093 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:22.093 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:22.093 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:22.351 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:22.610 [2024-07-12 10:52:57.680139] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.610 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:22.868 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.868 "name": "Existed_Raid", 00:26:22.868 "uuid": "2db05f78-8253-46f2-8c5c-ab92323304e1", 00:26:22.868 "strip_size_kb": 0, 00:26:22.868 "state": "online", 00:26:22.868 "raid_level": "raid1", 00:26:22.868 "superblock": true, 00:26:22.868 "num_base_bdevs": 2, 00:26:22.868 "num_base_bdevs_discovered": 1, 00:26:22.868 "num_base_bdevs_operational": 1, 00:26:22.868 "base_bdevs_list": [ 00:26:22.868 { 00:26:22.868 "name": null, 00:26:22.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.868 "is_configured": false, 00:26:22.868 "data_offset": 256, 00:26:22.868 "data_size": 7936 00:26:22.868 }, 00:26:22.868 { 00:26:22.868 "name": "BaseBdev2", 00:26:22.868 "uuid": "28af12b1-500e-4e72-9e16-7bbbb7d72ccd", 00:26:22.868 "is_configured": true, 00:26:22.868 "data_offset": 256, 00:26:22.868 "data_size": 7936 00:26:22.868 } 00:26:22.868 ] 00:26:22.868 }' 00:26:22.868 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.868 10:52:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:23.435 10:52:58 
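Deleting BaseBdev1 out from under the online volume is the redundancy check above: because raid1 can tolerate a missing mirror, Existed_Raid is expected to stay online with a single operational base bdev. A sketch of that check:

    # Sketch: remove one mirror and confirm the raid1 volume stays online.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    "$rpc" -s "$sock" bdev_malloc_delete BaseBdev1

    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(echo "$info" | jq -r .state) == online ]]
    [[ $(echo "$info" | jq -r .num_base_bdevs_operational) -eq 1 ]]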
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:23.435 10:52:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:23.435 10:52:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.435 10:52:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:23.693 10:52:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:23.693 10:52:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:23.693 10:52:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:23.951 [2024-07-12 10:52:58.976674] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:23.951 [2024-07-12 10:52:58.976758] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:23.951 [2024-07-12 10:52:58.987571] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:23.951 [2024-07-12 10:52:58.987605] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:23.951 [2024-07-12 10:52:58.987616] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x249c000 name Existed_Raid, state offline 00:26:23.951 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:23.951 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:23.951 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.951 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2153344 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2153344 ']' 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2153344 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2153344 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- 
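Removing BaseBdev2 as well drops the volume below raid1's minimum, so the trace above shows the state changing from online to offline and the final bdev_raid_get_bdevs query returning no name at all. A sketch of that last check before the app is killed:

    # Sketch: with the last base bdev gone the raid deconfigures and disappears.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    "$rpc" -s "$sock" bdev_malloc_delete BaseBdev2

    remaining=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
                jq -r '.[0]["name"] | select(.)')
    [[ -z $remaining ]]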
common/autotest_common.sh@966 -- # echo 'killing process with pid 2153344' 00:26:24.210 killing process with pid 2153344 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2153344 00:26:24.210 [2024-07-12 10:52:59.297090] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:24.210 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2153344 00:26:24.210 [2024-07-12 10:52:59.297958] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:24.469 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:26:24.469 00:26:24.469 real 0m10.242s 00:26:24.469 user 0m18.138s 00:26:24.469 sys 0m1.994s 00:26:24.469 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:24.469 10:52:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:24.469 ************************************ 00:26:24.469 END TEST raid_state_function_test_sb_4k 00:26:24.469 ************************************ 00:26:24.469 10:52:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:24.469 10:52:59 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:26:24.469 10:52:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:24.469 10:52:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:24.469 10:52:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:24.469 ************************************ 00:26:24.469 START TEST raid_superblock_test_4k 00:26:24.469 ************************************ 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:24.469 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@411 -- # raid_pid=2154865 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2154865 /var/tmp/spdk-raid.sock 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 2154865 ']' 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:24.470 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:24.470 10:52:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:24.470 [2024-07-12 10:52:59.645662] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:26:24.470 [2024-07-12 10:52:59.645728] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2154865 ] 00:26:24.728 [2024-07-12 10:52:59.773439] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:24.728 [2024-07-12 10:52:59.875635] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:24.987 [2024-07-12 10:52:59.935247] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:24.987 [2024-07-12 10:52:59.935286] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:25.554 10:53:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:25.554 10:53:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:26:25.554 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:25.554 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:25.554 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:25.554 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:25.554 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:25.555 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:25.555 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:25.555 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:25.555 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:26:25.813 malloc1 00:26:25.813 10:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:26.071 [2024-07-12 10:53:01.037034] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:26.071 [2024-07-12 10:53:01.037081] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:26.071 [2024-07-12 10:53:01.037103] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cdf570 00:26:26.071 [2024-07-12 10:53:01.037116] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:26.071 [2024-07-12 10:53:01.038856] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:26.071 [2024-07-12 10:53:01.038885] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:26.071 pt1 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:26.071 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:26:26.330 malloc2 00:26:26.330 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:26.589 [2024-07-12 10:53:01.531123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:26.589 [2024-07-12 10:53:01.531169] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:26.589 [2024-07-12 10:53:01.531187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce0970 00:26:26.589 [2024-07-12 10:53:01.531199] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:26.589 [2024-07-12 10:53:01.532814] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:26.589 [2024-07-12 10:53:01.532843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:26.589 pt2 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:26.589 [2024-07-12 10:53:01.763749] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:26.589 [2024-07-12 10:53:01.765094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:26.589 [2024-07-12 10:53:01.765253] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e83270 00:26:26.589 [2024-07-12 10:53:01.765267] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:26.589 [2024-07-12 10:53:01.765460] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd70e0 00:26:26.589 [2024-07-12 10:53:01.765612] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e83270 00:26:26.589 [2024-07-12 10:53:01.765624] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e83270 00:26:26.589 [2024-07-12 10:53:01.765726] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.589 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.847 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:26.847 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.847 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.847 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.847 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.847 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.847 10:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.847 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.847 "name": "raid_bdev1", 00:26:26.847 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:26.847 "strip_size_kb": 0, 00:26:26.847 "state": "online", 00:26:26.847 "raid_level": "raid1", 00:26:26.847 "superblock": true, 00:26:26.847 "num_base_bdevs": 2, 00:26:26.847 "num_base_bdevs_discovered": 2, 00:26:26.847 "num_base_bdevs_operational": 2, 00:26:26.847 "base_bdevs_list": [ 00:26:26.847 { 00:26:26.847 "name": "pt1", 00:26:26.847 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:26.847 "is_configured": true, 00:26:26.847 "data_offset": 256, 00:26:26.847 "data_size": 7936 00:26:26.847 }, 00:26:26.847 { 00:26:26.847 "name": "pt2", 00:26:26.847 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:26.847 "is_configured": true, 00:26:26.847 "data_offset": 256, 00:26:26.847 "data_size": 7936 00:26:26.847 } 00:26:26.847 ] 00:26:26.847 }' 00:26:26.847 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.847 10:53:02 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:26:27.414 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:27.414 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:27.414 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:27.414 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:27.414 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:27.414 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:27.672 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:27.672 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:27.672 [2024-07-12 10:53:02.834813] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:27.672 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:27.672 "name": "raid_bdev1", 00:26:27.672 "aliases": [ 00:26:27.672 "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14" 00:26:27.672 ], 00:26:27.672 "product_name": "Raid Volume", 00:26:27.672 "block_size": 4096, 00:26:27.672 "num_blocks": 7936, 00:26:27.672 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:27.672 "assigned_rate_limits": { 00:26:27.672 "rw_ios_per_sec": 0, 00:26:27.672 "rw_mbytes_per_sec": 0, 00:26:27.672 "r_mbytes_per_sec": 0, 00:26:27.672 "w_mbytes_per_sec": 0 00:26:27.672 }, 00:26:27.672 "claimed": false, 00:26:27.672 "zoned": false, 00:26:27.672 "supported_io_types": { 00:26:27.672 "read": true, 00:26:27.672 "write": true, 00:26:27.672 "unmap": false, 00:26:27.672 "flush": false, 00:26:27.672 "reset": true, 00:26:27.672 "nvme_admin": false, 00:26:27.672 "nvme_io": false, 00:26:27.672 "nvme_io_md": false, 00:26:27.672 "write_zeroes": true, 00:26:27.672 "zcopy": false, 00:26:27.672 "get_zone_info": false, 00:26:27.672 "zone_management": false, 00:26:27.672 "zone_append": false, 00:26:27.672 "compare": false, 00:26:27.672 "compare_and_write": false, 00:26:27.672 "abort": false, 00:26:27.672 "seek_hole": false, 00:26:27.672 "seek_data": false, 00:26:27.672 "copy": false, 00:26:27.672 "nvme_iov_md": false 00:26:27.672 }, 00:26:27.672 "memory_domains": [ 00:26:27.672 { 00:26:27.672 "dma_device_id": "system", 00:26:27.672 "dma_device_type": 1 00:26:27.672 }, 00:26:27.672 { 00:26:27.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:27.672 "dma_device_type": 2 00:26:27.672 }, 00:26:27.672 { 00:26:27.672 "dma_device_id": "system", 00:26:27.672 "dma_device_type": 1 00:26:27.672 }, 00:26:27.672 { 00:26:27.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:27.672 "dma_device_type": 2 00:26:27.672 } 00:26:27.672 ], 00:26:27.672 "driver_specific": { 00:26:27.672 "raid": { 00:26:27.672 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:27.672 "strip_size_kb": 0, 00:26:27.672 "state": "online", 00:26:27.672 "raid_level": "raid1", 00:26:27.672 "superblock": true, 00:26:27.672 "num_base_bdevs": 2, 00:26:27.672 "num_base_bdevs_discovered": 2, 00:26:27.672 "num_base_bdevs_operational": 2, 00:26:27.672 "base_bdevs_list": [ 00:26:27.672 { 00:26:27.672 "name": "pt1", 00:26:27.672 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:27.672 "is_configured": true, 00:26:27.672 
"data_offset": 256, 00:26:27.672 "data_size": 7936 00:26:27.672 }, 00:26:27.672 { 00:26:27.672 "name": "pt2", 00:26:27.672 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:27.672 "is_configured": true, 00:26:27.672 "data_offset": 256, 00:26:27.672 "data_size": 7936 00:26:27.672 } 00:26:27.672 ] 00:26:27.672 } 00:26:27.672 } 00:26:27.672 }' 00:26:27.672 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:27.937 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:27.937 pt2' 00:26:27.937 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:27.937 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:27.937 10:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:28.202 "name": "pt1", 00:26:28.202 "aliases": [ 00:26:28.202 "00000000-0000-0000-0000-000000000001" 00:26:28.202 ], 00:26:28.202 "product_name": "passthru", 00:26:28.202 "block_size": 4096, 00:26:28.202 "num_blocks": 8192, 00:26:28.202 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:28.202 "assigned_rate_limits": { 00:26:28.202 "rw_ios_per_sec": 0, 00:26:28.202 "rw_mbytes_per_sec": 0, 00:26:28.202 "r_mbytes_per_sec": 0, 00:26:28.202 "w_mbytes_per_sec": 0 00:26:28.202 }, 00:26:28.202 "claimed": true, 00:26:28.202 "claim_type": "exclusive_write", 00:26:28.202 "zoned": false, 00:26:28.202 "supported_io_types": { 00:26:28.202 "read": true, 00:26:28.202 "write": true, 00:26:28.202 "unmap": true, 00:26:28.202 "flush": true, 00:26:28.202 "reset": true, 00:26:28.202 "nvme_admin": false, 00:26:28.202 "nvme_io": false, 00:26:28.202 "nvme_io_md": false, 00:26:28.202 "write_zeroes": true, 00:26:28.202 "zcopy": true, 00:26:28.202 "get_zone_info": false, 00:26:28.202 "zone_management": false, 00:26:28.202 "zone_append": false, 00:26:28.202 "compare": false, 00:26:28.202 "compare_and_write": false, 00:26:28.202 "abort": true, 00:26:28.202 "seek_hole": false, 00:26:28.202 "seek_data": false, 00:26:28.202 "copy": true, 00:26:28.202 "nvme_iov_md": false 00:26:28.202 }, 00:26:28.202 "memory_domains": [ 00:26:28.202 { 00:26:28.202 "dma_device_id": "system", 00:26:28.202 "dma_device_type": 1 00:26:28.202 }, 00:26:28.202 { 00:26:28.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:28.202 "dma_device_type": 2 00:26:28.202 } 00:26:28.202 ], 00:26:28.202 "driver_specific": { 00:26:28.202 "passthru": { 00:26:28.202 "name": "pt1", 00:26:28.202 "base_bdev_name": "malloc1" 00:26:28.202 } 00:26:28.202 } 00:26:28.202 }' 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:28.202 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:28.460 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:28.460 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:28.460 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:28.460 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:28.460 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:28.460 "name": "pt2", 00:26:28.460 "aliases": [ 00:26:28.460 "00000000-0000-0000-0000-000000000002" 00:26:28.460 ], 00:26:28.460 "product_name": "passthru", 00:26:28.460 "block_size": 4096, 00:26:28.460 "num_blocks": 8192, 00:26:28.460 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:28.460 "assigned_rate_limits": { 00:26:28.460 "rw_ios_per_sec": 0, 00:26:28.460 "rw_mbytes_per_sec": 0, 00:26:28.460 "r_mbytes_per_sec": 0, 00:26:28.460 "w_mbytes_per_sec": 0 00:26:28.460 }, 00:26:28.460 "claimed": true, 00:26:28.460 "claim_type": "exclusive_write", 00:26:28.460 "zoned": false, 00:26:28.460 "supported_io_types": { 00:26:28.460 "read": true, 00:26:28.460 "write": true, 00:26:28.460 "unmap": true, 00:26:28.460 "flush": true, 00:26:28.460 "reset": true, 00:26:28.460 "nvme_admin": false, 00:26:28.460 "nvme_io": false, 00:26:28.460 "nvme_io_md": false, 00:26:28.460 "write_zeroes": true, 00:26:28.460 "zcopy": true, 00:26:28.460 "get_zone_info": false, 00:26:28.460 "zone_management": false, 00:26:28.460 "zone_append": false, 00:26:28.460 "compare": false, 00:26:28.460 "compare_and_write": false, 00:26:28.460 "abort": true, 00:26:28.460 "seek_hole": false, 00:26:28.460 "seek_data": false, 00:26:28.460 "copy": true, 00:26:28.460 "nvme_iov_md": false 00:26:28.460 }, 00:26:28.460 "memory_domains": [ 00:26:28.460 { 00:26:28.460 "dma_device_id": "system", 00:26:28.460 "dma_device_type": 1 00:26:28.460 }, 00:26:28.460 { 00:26:28.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:28.460 "dma_device_type": 2 00:26:28.460 } 00:26:28.460 ], 00:26:28.460 "driver_specific": { 00:26:28.460 "passthru": { 00:26:28.460 "name": "pt2", 00:26:28.460 "base_bdev_name": "malloc2" 00:26:28.460 } 00:26:28.460 } 00:26:28.460 }' 00:26:28.460 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:28.718 10:53:03 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:28.718 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:28.976 10:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:28.976 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:28.976 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:28.976 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:29.234 [2024-07-12 10:53:04.230516] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:29.234 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14 00:26:29.234 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14 ']' 00:26:29.234 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:29.492 [2024-07-12 10:53:04.474899] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:29.492 [2024-07-12 10:53:04.474924] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:29.492 [2024-07-12 10:53:04.474982] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:29.492 [2024-07-12 10:53:04.475038] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:29.492 [2024-07-12 10:53:04.475050] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e83270 name raid_bdev1, state offline 00:26:29.492 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.492 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:29.750 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:29.751 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:29.751 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:29.751 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:30.008 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:30.008 10:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:30.266 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:30.266 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:30.523 10:53:05 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:30.523 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:30.523 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:26:30.523 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:30.523 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:30.523 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:30.523 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:30.524 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:30.524 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:30.524 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:30.524 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:30.524 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:30.524 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:30.524 [2024-07-12 10:53:05.706117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:30.524 [2024-07-12 10:53:05.707459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:30.524 [2024-07-12 10:53:05.707524] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:30.524 [2024-07-12 10:53:05.707566] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:30.524 [2024-07-12 10:53:05.707585] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:30.524 [2024-07-12 10:53:05.707594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e82ff0 name raid_bdev1, state configuring 00:26:30.524 request: 00:26:30.524 { 00:26:30.524 "name": "raid_bdev1", 00:26:30.524 "raid_level": "raid1", 00:26:30.524 "base_bdevs": [ 00:26:30.524 "malloc1", 00:26:30.524 "malloc2" 00:26:30.524 ], 00:26:30.524 "superblock": false, 00:26:30.524 "method": "bdev_raid_create", 00:26:30.524 "req_id": 1 00:26:30.524 } 00:26:30.524 Got JSON-RPC error response 00:26:30.524 response: 00:26:30.524 { 00:26:30.524 "code": -17, 00:26:30.524 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:30.524 } 00:26:30.781 10:53:05 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@651 -- # es=1 00:26:30.781 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:30.781 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:30.781 10:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:30.782 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:30.782 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.782 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:30.782 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:30.782 10:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:31.040 [2024-07-12 10:53:06.191343] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:31.040 [2024-07-12 10:53:06.191394] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:31.040 [2024-07-12 10:53:06.191422] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cdf7a0 00:26:31.040 [2024-07-12 10:53:06.191435] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:31.040 [2024-07-12 10:53:06.193029] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:31.040 [2024-07-12 10:53:06.193057] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:31.040 [2024-07-12 10:53:06.193125] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:31.040 [2024-07-12 10:53:06.193150] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:31.040 pt1 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.040 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:26:31.298 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.299 "name": "raid_bdev1", 00:26:31.299 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:31.299 "strip_size_kb": 0, 00:26:31.299 "state": "configuring", 00:26:31.299 "raid_level": "raid1", 00:26:31.299 "superblock": true, 00:26:31.299 "num_base_bdevs": 2, 00:26:31.299 "num_base_bdevs_discovered": 1, 00:26:31.299 "num_base_bdevs_operational": 2, 00:26:31.299 "base_bdevs_list": [ 00:26:31.299 { 00:26:31.299 "name": "pt1", 00:26:31.299 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:31.299 "is_configured": true, 00:26:31.299 "data_offset": 256, 00:26:31.299 "data_size": 7936 00:26:31.299 }, 00:26:31.299 { 00:26:31.299 "name": null, 00:26:31.299 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:31.299 "is_configured": false, 00:26:31.299 "data_offset": 256, 00:26:31.299 "data_size": 7936 00:26:31.299 } 00:26:31.299 ] 00:26:31.299 }' 00:26:31.299 10:53:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.299 10:53:06 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:31.865 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:31.865 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:31.865 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:31.865 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:32.123 [2024-07-12 10:53:07.185982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:32.123 [2024-07-12 10:53:07.186032] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:32.123 [2024-07-12 10:53:07.186052] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e776f0 00:26:32.124 [2024-07-12 10:53:07.186064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:32.124 [2024-07-12 10:53:07.186408] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:32.124 [2024-07-12 10:53:07.186426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:32.124 [2024-07-12 10:53:07.186500] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:32.124 [2024-07-12 10:53:07.186524] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:32.124 [2024-07-12 10:53:07.186625] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e78590 00:26:32.124 [2024-07-12 10:53:07.186635] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:32.124 [2024-07-12 10:53:07.186801] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd9540 00:26:32.124 [2024-07-12 10:53:07.186922] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e78590 00:26:32.124 [2024-07-12 10:53:07.186932] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e78590 00:26:32.124 [2024-07-12 10:53:07.187031] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:32.124 pt2 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.124 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.382 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:32.382 "name": "raid_bdev1", 00:26:32.382 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:32.382 "strip_size_kb": 0, 00:26:32.382 "state": "online", 00:26:32.382 "raid_level": "raid1", 00:26:32.382 "superblock": true, 00:26:32.382 "num_base_bdevs": 2, 00:26:32.382 "num_base_bdevs_discovered": 2, 00:26:32.382 "num_base_bdevs_operational": 2, 00:26:32.382 "base_bdevs_list": [ 00:26:32.382 { 00:26:32.382 "name": "pt1", 00:26:32.382 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:32.382 "is_configured": true, 00:26:32.382 "data_offset": 256, 00:26:32.382 "data_size": 7936 00:26:32.382 }, 00:26:32.382 { 00:26:32.382 "name": "pt2", 00:26:32.382 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:32.382 "is_configured": true, 00:26:32.382 "data_offset": 256, 00:26:32.382 "data_size": 7936 00:26:32.382 } 00:26:32.382 ] 00:26:32.382 }' 00:26:32.382 10:53:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:32.382 10:53:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:32.949 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:33.207 [2024-07-12 10:53:08.289178] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:33.207 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:33.207 "name": "raid_bdev1", 00:26:33.207 "aliases": [ 00:26:33.207 "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14" 00:26:33.207 ], 00:26:33.207 "product_name": "Raid Volume", 00:26:33.207 "block_size": 4096, 00:26:33.207 "num_blocks": 7936, 00:26:33.207 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:33.207 "assigned_rate_limits": { 00:26:33.208 "rw_ios_per_sec": 0, 00:26:33.208 "rw_mbytes_per_sec": 0, 00:26:33.208 "r_mbytes_per_sec": 0, 00:26:33.208 "w_mbytes_per_sec": 0 00:26:33.208 }, 00:26:33.208 "claimed": false, 00:26:33.208 "zoned": false, 00:26:33.208 "supported_io_types": { 00:26:33.208 "read": true, 00:26:33.208 "write": true, 00:26:33.208 "unmap": false, 00:26:33.208 "flush": false, 00:26:33.208 "reset": true, 00:26:33.208 "nvme_admin": false, 00:26:33.208 "nvme_io": false, 00:26:33.208 "nvme_io_md": false, 00:26:33.208 "write_zeroes": true, 00:26:33.208 "zcopy": false, 00:26:33.208 "get_zone_info": false, 00:26:33.208 "zone_management": false, 00:26:33.208 "zone_append": false, 00:26:33.208 "compare": false, 00:26:33.208 "compare_and_write": false, 00:26:33.208 "abort": false, 00:26:33.208 "seek_hole": false, 00:26:33.208 "seek_data": false, 00:26:33.208 "copy": false, 00:26:33.208 "nvme_iov_md": false 00:26:33.208 }, 00:26:33.208 "memory_domains": [ 00:26:33.208 { 00:26:33.208 "dma_device_id": "system", 00:26:33.208 "dma_device_type": 1 00:26:33.208 }, 00:26:33.208 { 00:26:33.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:33.208 "dma_device_type": 2 00:26:33.208 }, 00:26:33.208 { 00:26:33.208 "dma_device_id": "system", 00:26:33.208 "dma_device_type": 1 00:26:33.208 }, 00:26:33.208 { 00:26:33.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:33.208 "dma_device_type": 2 00:26:33.208 } 00:26:33.208 ], 00:26:33.208 "driver_specific": { 00:26:33.208 "raid": { 00:26:33.208 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:33.208 "strip_size_kb": 0, 00:26:33.208 "state": "online", 00:26:33.208 "raid_level": "raid1", 00:26:33.208 "superblock": true, 00:26:33.208 "num_base_bdevs": 2, 00:26:33.208 "num_base_bdevs_discovered": 2, 00:26:33.208 "num_base_bdevs_operational": 2, 00:26:33.208 "base_bdevs_list": [ 00:26:33.208 { 00:26:33.208 "name": "pt1", 00:26:33.208 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:33.208 "is_configured": true, 00:26:33.208 "data_offset": 256, 00:26:33.208 "data_size": 7936 00:26:33.208 }, 00:26:33.208 { 00:26:33.208 "name": "pt2", 00:26:33.208 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:33.208 "is_configured": true, 00:26:33.208 "data_offset": 256, 00:26:33.208 "data_size": 7936 00:26:33.208 } 00:26:33.208 ] 00:26:33.208 } 00:26:33.208 } 00:26:33.208 }' 00:26:33.208 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:33.208 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:33.208 pt2' 00:26:33.208 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:33.208 10:53:08 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:33.208 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:33.466 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:33.466 "name": "pt1", 00:26:33.466 "aliases": [ 00:26:33.466 "00000000-0000-0000-0000-000000000001" 00:26:33.466 ], 00:26:33.466 "product_name": "passthru", 00:26:33.466 "block_size": 4096, 00:26:33.466 "num_blocks": 8192, 00:26:33.466 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:33.466 "assigned_rate_limits": { 00:26:33.466 "rw_ios_per_sec": 0, 00:26:33.466 "rw_mbytes_per_sec": 0, 00:26:33.466 "r_mbytes_per_sec": 0, 00:26:33.466 "w_mbytes_per_sec": 0 00:26:33.466 }, 00:26:33.466 "claimed": true, 00:26:33.466 "claim_type": "exclusive_write", 00:26:33.466 "zoned": false, 00:26:33.466 "supported_io_types": { 00:26:33.466 "read": true, 00:26:33.466 "write": true, 00:26:33.466 "unmap": true, 00:26:33.466 "flush": true, 00:26:33.466 "reset": true, 00:26:33.466 "nvme_admin": false, 00:26:33.466 "nvme_io": false, 00:26:33.466 "nvme_io_md": false, 00:26:33.466 "write_zeroes": true, 00:26:33.466 "zcopy": true, 00:26:33.466 "get_zone_info": false, 00:26:33.466 "zone_management": false, 00:26:33.466 "zone_append": false, 00:26:33.466 "compare": false, 00:26:33.466 "compare_and_write": false, 00:26:33.466 "abort": true, 00:26:33.466 "seek_hole": false, 00:26:33.466 "seek_data": false, 00:26:33.466 "copy": true, 00:26:33.466 "nvme_iov_md": false 00:26:33.466 }, 00:26:33.466 "memory_domains": [ 00:26:33.466 { 00:26:33.466 "dma_device_id": "system", 00:26:33.466 "dma_device_type": 1 00:26:33.466 }, 00:26:33.466 { 00:26:33.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:33.466 "dma_device_type": 2 00:26:33.466 } 00:26:33.466 ], 00:26:33.466 "driver_specific": { 00:26:33.466 "passthru": { 00:26:33.466 "name": "pt1", 00:26:33.466 "base_bdev_name": "malloc1" 00:26:33.466 } 00:26:33.466 } 00:26:33.466 }' 00:26:33.466 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.466 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.723 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:33.723 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.723 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.724 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:33.724 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.724 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.724 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:33.724 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.724 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.981 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:33.981 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:33.981 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:33.981 10:53:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:34.239 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:34.239 "name": "pt2", 00:26:34.239 "aliases": [ 00:26:34.239 "00000000-0000-0000-0000-000000000002" 00:26:34.239 ], 00:26:34.239 "product_name": "passthru", 00:26:34.239 "block_size": 4096, 00:26:34.239 "num_blocks": 8192, 00:26:34.239 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:34.239 "assigned_rate_limits": { 00:26:34.239 "rw_ios_per_sec": 0, 00:26:34.239 "rw_mbytes_per_sec": 0, 00:26:34.239 "r_mbytes_per_sec": 0, 00:26:34.239 "w_mbytes_per_sec": 0 00:26:34.239 }, 00:26:34.239 "claimed": true, 00:26:34.239 "claim_type": "exclusive_write", 00:26:34.239 "zoned": false, 00:26:34.239 "supported_io_types": { 00:26:34.239 "read": true, 00:26:34.239 "write": true, 00:26:34.239 "unmap": true, 00:26:34.239 "flush": true, 00:26:34.239 "reset": true, 00:26:34.239 "nvme_admin": false, 00:26:34.239 "nvme_io": false, 00:26:34.239 "nvme_io_md": false, 00:26:34.239 "write_zeroes": true, 00:26:34.239 "zcopy": true, 00:26:34.239 "get_zone_info": false, 00:26:34.239 "zone_management": false, 00:26:34.239 "zone_append": false, 00:26:34.239 "compare": false, 00:26:34.239 "compare_and_write": false, 00:26:34.239 "abort": true, 00:26:34.239 "seek_hole": false, 00:26:34.239 "seek_data": false, 00:26:34.239 "copy": true, 00:26:34.239 "nvme_iov_md": false 00:26:34.239 }, 00:26:34.239 "memory_domains": [ 00:26:34.239 { 00:26:34.239 "dma_device_id": "system", 00:26:34.239 "dma_device_type": 1 00:26:34.239 }, 00:26:34.239 { 00:26:34.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.239 "dma_device_type": 2 00:26:34.239 } 00:26:34.239 ], 00:26:34.239 "driver_specific": { 00:26:34.239 "passthru": { 00:26:34.239 "name": "pt2", 00:26:34.239 "base_bdev_name": "malloc2" 00:26:34.239 } 00:26:34.239 } 00:26:34.239 }' 00:26:34.239 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:34.240 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:34.240 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:34.240 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:34.240 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:34.240 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:34.240 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:34.240 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:34.497 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:34.497 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:34.497 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:34.497 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:34.497 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:34.497 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 
-- # jq -r '.[] | .uuid' 00:26:34.755 [2024-07-12 10:53:09.769129] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:34.755 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14 '!=' 0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14 ']' 00:26:34.755 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:34.755 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:34.755 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:34.755 10:53:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:35.013 [2024-07-12 10:53:10.013559] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.013 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.271 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:35.271 "name": "raid_bdev1", 00:26:35.271 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:35.271 "strip_size_kb": 0, 00:26:35.271 "state": "online", 00:26:35.271 "raid_level": "raid1", 00:26:35.271 "superblock": true, 00:26:35.271 "num_base_bdevs": 2, 00:26:35.271 "num_base_bdevs_discovered": 1, 00:26:35.271 "num_base_bdevs_operational": 1, 00:26:35.271 "base_bdevs_list": [ 00:26:35.271 { 00:26:35.271 "name": null, 00:26:35.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:35.271 "is_configured": false, 00:26:35.271 "data_offset": 256, 00:26:35.271 "data_size": 7936 00:26:35.271 }, 00:26:35.271 { 00:26:35.271 "name": "pt2", 00:26:35.271 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:35.271 "is_configured": true, 00:26:35.271 "data_offset": 256, 00:26:35.271 "data_size": 7936 00:26:35.271 } 00:26:35.271 ] 00:26:35.271 }' 00:26:35.271 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:35.271 10:53:10 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 
00:26:35.883 10:53:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:36.207 [2024-07-12 10:53:11.092386] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:36.207 [2024-07-12 10:53:11.092415] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:36.207 [2024-07-12 10:53:11.092472] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:36.207 [2024-07-12 10:53:11.092524] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:36.207 [2024-07-12 10:53:11.092538] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e78590 name raid_bdev1, state offline 00:26:36.207 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:36.207 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.207 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:36.207 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:36.207 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:36.207 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:36.207 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:36.466 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:36.466 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:36.466 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:36.466 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:36.466 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:26:36.466 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:36.725 [2024-07-12 10:53:11.842344] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:36.725 [2024-07-12 10:53:11.842397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:36.725 [2024-07-12 10:53:11.842416] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce0160 00:26:36.725 [2024-07-12 10:53:11.842429] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:36.725 [2024-07-12 10:53:11.844024] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:36.725 [2024-07-12 10:53:11.844053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:36.725 [2024-07-12 10:53:11.844120] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:36.725 [2024-07-12 10:53:11.844146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:36.725 [2024-07-12 10:53:11.844228] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd6380 00:26:36.725 [2024-07-12 10:53:11.844239] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:36.725 [2024-07-12 10:53:11.844408] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd7a80 00:26:36.725 [2024-07-12 10:53:11.844535] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cd6380 00:26:36.725 [2024-07-12 10:53:11.844546] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cd6380 00:26:36.725 [2024-07-12 10:53:11.844643] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:36.725 pt2 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.725 10:53:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.984 10:53:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:36.984 "name": "raid_bdev1", 00:26:36.984 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:36.984 "strip_size_kb": 0, 00:26:36.984 "state": "online", 00:26:36.984 "raid_level": "raid1", 00:26:36.984 "superblock": true, 00:26:36.984 "num_base_bdevs": 2, 00:26:36.984 "num_base_bdevs_discovered": 1, 00:26:36.984 "num_base_bdevs_operational": 1, 00:26:36.984 "base_bdevs_list": [ 00:26:36.984 { 00:26:36.984 "name": null, 00:26:36.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.984 "is_configured": false, 00:26:36.984 "data_offset": 256, 00:26:36.984 "data_size": 7936 00:26:36.984 }, 00:26:36.984 { 00:26:36.984 "name": "pt2", 00:26:36.984 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:36.984 "is_configured": true, 00:26:36.984 "data_offset": 256, 00:26:36.984 "data_size": 7936 00:26:36.984 } 00:26:36.984 ] 00:26:36.984 }' 00:26:36.984 10:53:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:36.984 10:53:12 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:37.553 10:53:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:26:37.812 [2024-07-12 10:53:12.925186] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:37.812 [2024-07-12 10:53:12.925213] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:37.812 [2024-07-12 10:53:12.925267] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:37.812 [2024-07-12 10:53:12.925309] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:37.812 [2024-07-12 10:53:12.925320] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd6380 name raid_bdev1, state offline 00:26:37.812 10:53:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.812 10:53:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:38.069 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:38.069 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:38.069 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:38.069 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:38.327 [2024-07-12 10:53:13.286132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:38.327 [2024-07-12 10:53:13.286183] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:38.327 [2024-07-12 10:53:13.286203] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e82520 00:26:38.327 [2024-07-12 10:53:13.286215] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:38.327 [2024-07-12 10:53:13.287810] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:38.327 [2024-07-12 10:53:13.287839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:38.327 [2024-07-12 10:53:13.287904] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:38.327 [2024-07-12 10:53:13.287928] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:38.327 [2024-07-12 10:53:13.288028] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:38.327 [2024-07-12 10:53:13.288041] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:38.327 [2024-07-12 10:53:13.288054] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd73f0 name raid_bdev1, state configuring 00:26:38.327 [2024-07-12 10:53:13.288077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:38.327 [2024-07-12 10:53:13.288135] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd92b0 00:26:38.327 [2024-07-12 10:53:13.288145] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:38.327 [2024-07-12 10:53:13.288310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd6350 00:26:38.327 [2024-07-12 10:53:13.288428] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cd92b0 
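What the trace above exercises is superblock-driven reassembly: with raid_bdev1 deleted, recreating a single passthru base bdev is enough for the examine path to find the on-disk superblock, prefer the copy with the higher seq_number (pt2's 4 beats pt1's 2), and bring the array back online around the surviving member. A minimal sketch of that flow, assuming the same RPC socket and jq filter used elsewhere in this run:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # tear down the assembled array; the superblocks stay on the base bdevs
    $rpc bdev_raid_delete raid_bdev1

    # recreate only pt1 on top of malloc1; examine finds its superblock,
    # sees that pt2 carries a newer one, and re-registers raid_bdev1 from pt2
    $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001

    # confirm the array is back online with one configured base bdev
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'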
00:26:38.327 [2024-07-12 10:53:13.288438] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cd92b0 00:26:38.327 [2024-07-12 10:53:13.288548] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:38.327 pt1 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.327 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.585 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.585 "name": "raid_bdev1", 00:26:38.585 "uuid": "0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14", 00:26:38.585 "strip_size_kb": 0, 00:26:38.585 "state": "online", 00:26:38.585 "raid_level": "raid1", 00:26:38.585 "superblock": true, 00:26:38.585 "num_base_bdevs": 2, 00:26:38.585 "num_base_bdevs_discovered": 1, 00:26:38.585 "num_base_bdevs_operational": 1, 00:26:38.585 "base_bdevs_list": [ 00:26:38.585 { 00:26:38.585 "name": null, 00:26:38.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.585 "is_configured": false, 00:26:38.585 "data_offset": 256, 00:26:38.585 "data_size": 7936 00:26:38.585 }, 00:26:38.585 { 00:26:38.585 "name": "pt2", 00:26:38.586 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:38.586 "is_configured": true, 00:26:38.586 "data_offset": 256, 00:26:38.586 "data_size": 7936 00:26:38.586 } 00:26:38.586 ] 00:26:38.586 }' 00:26:38.586 10:53:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.586 10:53:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:39.153 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:39.153 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:39.411 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:39.411 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:39.411 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:39.670 [2024-07-12 10:53:14.613959] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14 '!=' 0a0c63c7-7e91-4d69-b7fc-e38c3bb0df14 ']' 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2154865 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 2154865 ']' 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 2154865 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2154865 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2154865' 00:26:39.670 killing process with pid 2154865 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 2154865 00:26:39.670 [2024-07-12 10:53:14.684133] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:39.670 [2024-07-12 10:53:14.684186] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:39.670 [2024-07-12 10:53:14.684226] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:39.670 [2024-07-12 10:53:14.684238] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd92b0 name raid_bdev1, state offline 00:26:39.670 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 2154865 00:26:39.670 [2024-07-12 10:53:14.703547] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:39.930 10:53:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:26:39.930 00:26:39.930 real 0m15.344s 00:26:39.930 user 0m27.818s 00:26:39.930 sys 0m2.840s 00:26:39.930 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:39.930 10:53:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:39.930 ************************************ 00:26:39.930 END TEST raid_superblock_test_4k 00:26:39.930 ************************************ 00:26:39.930 10:53:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:39.930 10:53:14 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:26:39.930 10:53:14 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:26:39.930 10:53:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:39.930 10:53:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:39.930 10:53:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:39.930 
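Before the rebuild test begins, the superblock test closes with two identity checks that are easy to miss in the trace: the removed slot must still be unconfigured, and the reassembled array must keep its original UUID. A condensed sketch of those checks, reusing only the RPCs and jq filters shown above:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # slot 0 (the dropped base bdev) stays unconfigured after reassembly
    $rpc bdev_raid_get_bdevs online | jq -r '.[].base_bdevs_list[0].is_configured'   # expect: false

    # the array UUID survives delete + reassembly (0a0c63c7-... in this run)
    $rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid'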
************************************ 00:26:39.930 START TEST raid_rebuild_test_sb_4k 00:26:39.930 ************************************ 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2157214 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2157214 /var/tmp/spdk-raid.sock 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2157214 ']' 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 
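The rebuild test drives everything through the bdevperf example app rather than a full SPDK target: the app is launched idle on its own RPC socket, and every rpc.py call that follows points at that socket. A minimal sketch of the launch, using the exact arguments recorded above; the wait step is a simple stand-in for the harness's waitforlisten helper:

    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk

    # -z: do not start the workload immediately, the test drives it over RPC
    # -L bdev_raid: enable the bdev_raid debug log flag (the *DEBUG* lines above)
    $spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # wait until the UNIX socket answers RPCs before configuring any bdevs
    until $spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods > /dev/null 2>&1; do
        sleep 0.5
    done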
00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:39.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:39.930 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:39.930 [2024-07-12 10:53:15.056274] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:26:39.930 [2024-07-12 10:53:15.056319] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2157214 ] 00:26:39.930 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:39.930 Zero copy mechanism will not be used. 00:26:40.189 [2024-07-12 10:53:15.170600] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:40.189 [2024-07-12 10:53:15.276838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:40.189 [2024-07-12 10:53:15.329252] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:40.189 [2024-07-12 10:53:15.329280] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:40.448 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:40.448 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:40.448 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:40.448 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:26:40.708 BaseBdev1_malloc 00:26:40.708 10:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:40.967 [2024-07-12 10:53:16.017969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:40.967 [2024-07-12 10:53:16.018018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:40.967 [2024-07-12 10:53:16.018043] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d2d40 00:26:40.967 [2024-07-12 10:53:16.018057] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:40.967 [2024-07-12 10:53:16.019848] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:40.967 [2024-07-12 10:53:16.019882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:40.967 BaseBdev1 00:26:40.967 10:53:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:40.967 10:53:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:26:41.226 BaseBdev2_malloc 
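Each RAID member in this test is a small two-layer stack: a 32 MB malloc bdev with 4096-byte blocks, wrapped in a passthru bdev that the raid layer can claim and later release; the spare built just below gets the same stack with an extra delay bdev in the middle. A sketch of the per-member setup, mirroring the RPCs above:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for bdev in BaseBdev1 BaseBdev2; do
        # 32 MB backing store, 4 KiB logical blocks
        $rpc bdev_malloc_create 32 4096 -b ${bdev}_malloc
        # passthru wrapper that bdev_raid_create will claim
        $rpc bdev_passthru_create -b ${bdev}_malloc -p $bdev
    done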
00:26:41.226 10:53:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:41.485 [2024-07-12 10:53:16.505411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:41.485 [2024-07-12 10:53:16.505458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:41.485 [2024-07-12 10:53:16.505489] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d3860 00:26:41.485 [2024-07-12 10:53:16.505502] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:41.485 [2024-07-12 10:53:16.507100] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:41.485 [2024-07-12 10:53:16.507128] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:41.485 BaseBdev2 00:26:41.485 10:53:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:26:41.745 spare_malloc 00:26:41.745 10:53:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:42.004 spare_delay 00:26:42.004 10:53:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:42.261 [2024-07-12 10:53:17.225111] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:42.261 [2024-07-12 10:53:17.225158] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.261 [2024-07-12 10:53:17.225179] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1281ec0 00:26:42.261 [2024-07-12 10:53:17.225192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.261 [2024-07-12 10:53:17.226777] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.261 [2024-07-12 10:53:17.226805] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:42.261 spare 00:26:42.261 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:42.519 [2024-07-12 10:53:17.469793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:42.519 [2024-07-12 10:53:17.471036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:42.520 [2024-07-12 10:53:17.471201] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1283070 00:26:42.520 [2024-07-12 10:53:17.471214] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:42.520 [2024-07-12 10:53:17.471403] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x127c490 00:26:42.520 [2024-07-12 10:53:17.471550] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1283070 00:26:42.520 [2024-07-12 10:53:17.471561] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x1283070 00:26:42.520 [2024-07-12 10:53:17.471661] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.520 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.778 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.778 "name": "raid_bdev1", 00:26:42.778 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:42.778 "strip_size_kb": 0, 00:26:42.778 "state": "online", 00:26:42.778 "raid_level": "raid1", 00:26:42.778 "superblock": true, 00:26:42.778 "num_base_bdevs": 2, 00:26:42.778 "num_base_bdevs_discovered": 2, 00:26:42.778 "num_base_bdevs_operational": 2, 00:26:42.778 "base_bdevs_list": [ 00:26:42.778 { 00:26:42.778 "name": "BaseBdev1", 00:26:42.778 "uuid": "f5aa57d7-f4e8-5309-b4b0-5029ee087072", 00:26:42.778 "is_configured": true, 00:26:42.778 "data_offset": 256, 00:26:42.778 "data_size": 7936 00:26:42.778 }, 00:26:42.778 { 00:26:42.778 "name": "BaseBdev2", 00:26:42.778 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:42.778 "is_configured": true, 00:26:42.778 "data_offset": 256, 00:26:42.778 "data_size": 7936 00:26:42.778 } 00:26:42.778 ] 00:26:42.778 }' 00:26:42.778 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.778 10:53:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:43.344 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:43.344 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:43.602 [2024-07-12 10:53:18.584959] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:43.602 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:43.602 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
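With both members claimed, the array is created with an on-disk superblock (-s) and its geometry is read back over RPC: the usable block count (7936 x 4 KiB) and the data_offset (256 blocks reserved for the superblock) determine the dd transfer sizes used next. A sketch of those steps, assuming the same socket and jq filters as the surrounding lines:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # RAID1 over the two passthru members; -s writes a superblock to each of them
    $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1

    # usable size of the assembled array, in blocks (7936 here)
    raid_bdev_size=$($rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[].num_blocks')

    # blocks reserved ahead of the data for the superblock (256 here)
    data_offset=$($rpc bdev_raid_get_bdevs all | jq -r '.[].base_bdevs_list[0].data_offset')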
00:26:43.602 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:43.860 10:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:44.119 [2024-07-12 10:53:19.086098] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x127c490 00:26:44.119 /dev/nbd0 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:44.119 1+0 records in 00:26:44.119 1+0 records out 00:26:44.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257731 s, 15.9 MB/s 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:44.119 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:45.054 7936+0 records in 00:26:45.054 7936+0 records out 00:26:45.054 32505856 bytes (33 MB, 31 MiB) copied, 0.76586 s, 42.4 MB/s 00:26:45.054 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:45.054 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:45.054 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:45.054 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:45.054 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:45.054 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:45.054 10:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:45.054 [2024-07-12 10:53:20.183735] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:45.054 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:45.312 [2024-07-12 10:53:20.412364] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:45.312 10:53:20 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.312 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.570 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.571 "name": "raid_bdev1", 00:26:45.571 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:45.571 "strip_size_kb": 0, 00:26:45.571 "state": "online", 00:26:45.571 "raid_level": "raid1", 00:26:45.571 "superblock": true, 00:26:45.571 "num_base_bdevs": 2, 00:26:45.571 "num_base_bdevs_discovered": 1, 00:26:45.571 "num_base_bdevs_operational": 1, 00:26:45.571 "base_bdevs_list": [ 00:26:45.571 { 00:26:45.571 "name": null, 00:26:45.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.571 "is_configured": false, 00:26:45.571 "data_offset": 256, 00:26:45.571 "data_size": 7936 00:26:45.571 }, 00:26:45.571 { 00:26:45.571 "name": "BaseBdev2", 00:26:45.571 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:45.571 "is_configured": true, 00:26:45.571 "data_offset": 256, 00:26:45.571 "data_size": 7936 00:26:45.571 } 00:26:45.571 ] 00:26:45.571 }' 00:26:45.571 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.571 10:53:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:46.135 10:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:46.391 [2024-07-12 10:53:21.443110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:46.391 [2024-07-12 10:53:21.448094] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1282ce0 00:26:46.391 [2024-07-12 10:53:21.450299] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:46.391 10:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:47.322 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:47.322 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:47.322 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:47.322 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:47.322 10:53:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:47.322 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.322 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:47.578 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:47.578 "name": "raid_bdev1", 00:26:47.578 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:47.578 "strip_size_kb": 0, 00:26:47.578 "state": "online", 00:26:47.578 "raid_level": "raid1", 00:26:47.578 "superblock": true, 00:26:47.578 "num_base_bdevs": 2, 00:26:47.578 "num_base_bdevs_discovered": 2, 00:26:47.578 "num_base_bdevs_operational": 2, 00:26:47.578 "process": { 00:26:47.578 "type": "rebuild", 00:26:47.578 "target": "spare", 00:26:47.578 "progress": { 00:26:47.578 "blocks": 2816, 00:26:47.578 "percent": 35 00:26:47.578 } 00:26:47.578 }, 00:26:47.578 "base_bdevs_list": [ 00:26:47.578 { 00:26:47.578 "name": "spare", 00:26:47.578 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:47.578 "is_configured": true, 00:26:47.578 "data_offset": 256, 00:26:47.578 "data_size": 7936 00:26:47.578 }, 00:26:47.578 { 00:26:47.578 "name": "BaseBdev2", 00:26:47.578 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:47.578 "is_configured": true, 00:26:47.578 "data_offset": 256, 00:26:47.578 "data_size": 7936 00:26:47.578 } 00:26:47.578 ] 00:26:47.578 }' 00:26:47.578 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:47.578 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:47.578 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.578 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:47.579 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:47.836 [2024-07-12 10:53:22.900650] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:47.836 [2024-07-12 10:53:22.962129] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:47.836 [2024-07-12 10:53:22.962178] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:47.836 [2024-07-12 10:53:22.962193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:47.836 [2024-07-12 10:53:22.962202] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.836 10:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.093 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.093 "name": "raid_bdev1", 00:26:48.093 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:48.093 "strip_size_kb": 0, 00:26:48.093 "state": "online", 00:26:48.093 "raid_level": "raid1", 00:26:48.093 "superblock": true, 00:26:48.093 "num_base_bdevs": 2, 00:26:48.093 "num_base_bdevs_discovered": 1, 00:26:48.093 "num_base_bdevs_operational": 1, 00:26:48.093 "base_bdevs_list": [ 00:26:48.093 { 00:26:48.093 "name": null, 00:26:48.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.093 "is_configured": false, 00:26:48.093 "data_offset": 256, 00:26:48.093 "data_size": 7936 00:26:48.093 }, 00:26:48.093 { 00:26:48.093 "name": "BaseBdev2", 00:26:48.093 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:48.093 "is_configured": true, 00:26:48.093 "data_offset": 256, 00:26:48.093 "data_size": 7936 00:26:48.093 } 00:26:48.093 ] 00:26:48.093 }' 00:26:48.093 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.093 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:48.706 "name": "raid_bdev1", 00:26:48.706 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:48.706 "strip_size_kb": 0, 00:26:48.706 "state": "online", 00:26:48.706 "raid_level": "raid1", 00:26:48.706 "superblock": true, 00:26:48.706 "num_base_bdevs": 2, 00:26:48.706 "num_base_bdevs_discovered": 1, 00:26:48.706 "num_base_bdevs_operational": 1, 00:26:48.706 "base_bdevs_list": [ 00:26:48.706 { 00:26:48.706 "name": null, 00:26:48.706 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.706 "is_configured": false, 00:26:48.706 "data_offset": 
256, 00:26:48.706 "data_size": 7936 00:26:48.706 }, 00:26:48.706 { 00:26:48.706 "name": "BaseBdev2", 00:26:48.706 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:48.706 "is_configured": true, 00:26:48.706 "data_offset": 256, 00:26:48.706 "data_size": 7936 00:26:48.706 } 00:26:48.706 ] 00:26:48.706 }' 00:26:48.706 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:48.964 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:48.964 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:48.964 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:48.964 10:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:48.964 [2024-07-12 10:53:24.081421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:48.964 [2024-07-12 10:53:24.086384] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1282ce0 00:26:48.964 [2024-07-12 10:53:24.087851] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:48.964 10:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.333 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.333 "name": "raid_bdev1", 00:26:50.333 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:50.333 "strip_size_kb": 0, 00:26:50.333 "state": "online", 00:26:50.333 "raid_level": "raid1", 00:26:50.333 "superblock": true, 00:26:50.333 "num_base_bdevs": 2, 00:26:50.333 "num_base_bdevs_discovered": 2, 00:26:50.333 "num_base_bdevs_operational": 2, 00:26:50.333 "process": { 00:26:50.333 "type": "rebuild", 00:26:50.333 "target": "spare", 00:26:50.333 "progress": { 00:26:50.333 "blocks": 2816, 00:26:50.333 "percent": 35 00:26:50.333 } 00:26:50.333 }, 00:26:50.333 "base_bdevs_list": [ 00:26:50.333 { 00:26:50.333 "name": "spare", 00:26:50.333 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:50.334 "is_configured": true, 00:26:50.334 "data_offset": 256, 00:26:50.334 "data_size": 7936 00:26:50.334 }, 00:26:50.334 { 00:26:50.334 "name": "BaseBdev2", 00:26:50.334 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:50.334 "is_configured": true, 00:26:50.334 "data_offset": 256, 00:26:50.334 "data_size": 7936 00:26:50.334 } 00:26:50.334 ] 00:26:50.334 }' 00:26:50.334 
10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:50.334 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1001 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.334 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.590 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.590 "name": "raid_bdev1", 00:26:50.590 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:50.590 "strip_size_kb": 0, 00:26:50.590 "state": "online", 00:26:50.590 "raid_level": "raid1", 00:26:50.590 "superblock": true, 00:26:50.590 "num_base_bdevs": 2, 00:26:50.590 "num_base_bdevs_discovered": 2, 00:26:50.590 "num_base_bdevs_operational": 2, 00:26:50.590 "process": { 00:26:50.590 "type": "rebuild", 00:26:50.590 "target": "spare", 00:26:50.590 "progress": { 00:26:50.590 "blocks": 3840, 00:26:50.590 "percent": 48 00:26:50.590 } 00:26:50.590 }, 00:26:50.590 "base_bdevs_list": [ 00:26:50.590 { 00:26:50.590 "name": "spare", 00:26:50.590 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:50.590 "is_configured": true, 00:26:50.590 "data_offset": 256, 00:26:50.590 "data_size": 7936 00:26:50.590 }, 00:26:50.590 { 00:26:50.590 "name": "BaseBdev2", 00:26:50.590 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:50.590 "is_configured": true, 00:26:50.590 "data_offset": 256, 00:26:50.590 "data_size": 7936 00:26:50.590 } 00:26:50.590 ] 00:26:50.590 }' 00:26:50.590 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:50.590 
10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:50.590 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.590 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:50.591 10:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.959 "name": "raid_bdev1", 00:26:51.959 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:51.959 "strip_size_kb": 0, 00:26:51.959 "state": "online", 00:26:51.959 "raid_level": "raid1", 00:26:51.959 "superblock": true, 00:26:51.959 "num_base_bdevs": 2, 00:26:51.959 "num_base_bdevs_discovered": 2, 00:26:51.959 "num_base_bdevs_operational": 2, 00:26:51.959 "process": { 00:26:51.959 "type": "rebuild", 00:26:51.959 "target": "spare", 00:26:51.959 "progress": { 00:26:51.959 "blocks": 7168, 00:26:51.959 "percent": 90 00:26:51.959 } 00:26:51.959 }, 00:26:51.959 "base_bdevs_list": [ 00:26:51.959 { 00:26:51.959 "name": "spare", 00:26:51.959 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:51.959 "is_configured": true, 00:26:51.959 "data_offset": 256, 00:26:51.959 "data_size": 7936 00:26:51.959 }, 00:26:51.959 { 00:26:51.959 "name": "BaseBdev2", 00:26:51.959 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:51.959 "is_configured": true, 00:26:51.959 "data_offset": 256, 00:26:51.959 "data_size": 7936 00:26:51.959 } 00:26:51.959 ] 00:26:51.959 }' 00:26:51.959 10:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.959 10:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:51.959 10:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.959 10:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:51.959 10:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:52.216 [2024-07-12 10:53:27.212050] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:52.216 [2024-07-12 10:53:27.212108] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:52.216 [2024-07-12 10:53:27.212191] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:53.151 "name": "raid_bdev1", 00:26:53.151 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:53.151 "strip_size_kb": 0, 00:26:53.151 "state": "online", 00:26:53.151 "raid_level": "raid1", 00:26:53.151 "superblock": true, 00:26:53.151 "num_base_bdevs": 2, 00:26:53.151 "num_base_bdevs_discovered": 2, 00:26:53.151 "num_base_bdevs_operational": 2, 00:26:53.151 "base_bdevs_list": [ 00:26:53.151 { 00:26:53.151 "name": "spare", 00:26:53.151 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:53.151 "is_configured": true, 00:26:53.151 "data_offset": 256, 00:26:53.151 "data_size": 7936 00:26:53.151 }, 00:26:53.151 { 00:26:53.151 "name": "BaseBdev2", 00:26:53.151 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:53.151 "is_configured": true, 00:26:53.151 "data_offset": 256, 00:26:53.151 "data_size": 7936 00:26:53.151 } 00:26:53.151 ] 00:26:53.151 }' 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:53.151 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.410 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:26:53.668 "name": "raid_bdev1", 00:26:53.668 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:53.668 "strip_size_kb": 0, 00:26:53.668 "state": "online", 00:26:53.668 "raid_level": "raid1", 00:26:53.668 "superblock": true, 00:26:53.668 "num_base_bdevs": 2, 00:26:53.668 "num_base_bdevs_discovered": 2, 00:26:53.668 "num_base_bdevs_operational": 2, 00:26:53.668 "base_bdevs_list": [ 00:26:53.668 { 00:26:53.668 "name": "spare", 00:26:53.668 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:53.668 "is_configured": true, 00:26:53.668 "data_offset": 256, 00:26:53.668 "data_size": 7936 00:26:53.668 }, 00:26:53.668 { 00:26:53.668 "name": "BaseBdev2", 00:26:53.668 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:53.668 "is_configured": true, 00:26:53.668 "data_offset": 256, 00:26:53.668 "data_size": 7936 00:26:53.668 } 00:26:53.668 ] 00:26:53.668 }' 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:53.668 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:53.669 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:53.669 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:53.669 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.669 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.927 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.927 "name": "raid_bdev1", 00:26:53.927 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:53.927 "strip_size_kb": 0, 00:26:53.927 "state": "online", 00:26:53.927 "raid_level": "raid1", 00:26:53.927 "superblock": true, 00:26:53.927 "num_base_bdevs": 2, 00:26:53.927 "num_base_bdevs_discovered": 2, 00:26:53.927 "num_base_bdevs_operational": 2, 00:26:53.927 "base_bdevs_list": [ 00:26:53.927 { 00:26:53.927 "name": "spare", 00:26:53.927 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:53.927 "is_configured": true, 00:26:53.927 "data_offset": 256, 00:26:53.927 "data_size": 7936 00:26:53.927 }, 00:26:53.927 { 00:26:53.927 "name": 
"BaseBdev2", 00:26:53.927 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:53.927 "is_configured": true, 00:26:53.927 "data_offset": 256, 00:26:53.927 "data_size": 7936 00:26:53.927 } 00:26:53.927 ] 00:26:53.927 }' 00:26:53.927 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.927 10:53:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:54.494 10:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:54.753 [2024-07-12 10:53:29.743961] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:54.753 [2024-07-12 10:53:29.743988] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:54.753 [2024-07-12 10:53:29.744050] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:54.753 [2024-07-12 10:53:29.744107] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:54.753 [2024-07-12 10:53:29.744118] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1283070 name raid_bdev1, state offline 00:26:54.753 10:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.753 10:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:55.012 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:55.269 /dev/nbd0 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@867 -- # local i 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:55.270 1+0 records in 00:26:55.270 1+0 records out 00:26:55.270 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269716 s, 15.2 MB/s 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:55.270 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:55.528 /dev/nbd1 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:55.528 1+0 records in 00:26:55.528 1+0 records out 00:26:55.528 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00033409 s, 12.3 MB/s 00:26:55.528 
10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:55.528 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:55.786 10:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:56.045 10:53:31 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:56.045 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:56.303 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:56.562 [2024-07-12 10:53:31.661995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:56.562 [2024-07-12 10:53:31.662039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:56.562 [2024-07-12 10:53:31.662059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1282500 00:26:56.562 [2024-07-12 10:53:31.662072] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:56.562 [2024-07-12 10:53:31.663665] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:56.562 [2024-07-12 10:53:31.663692] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:56.562 [2024-07-12 10:53:31.663767] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:56.562 [2024-07-12 10:53:31.663793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:56.562 [2024-07-12 10:53:31.663893] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:56.562 spare 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.562 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.821 [2024-07-12 10:53:31.764206] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1283490 00:26:56.821 [2024-07-12 
10:53:31.764226] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:56.821 [2024-07-12 10:53:31.764428] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x127bf50 00:26:56.821 [2024-07-12 10:53:31.764590] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1283490 00:26:56.821 [2024-07-12 10:53:31.764602] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1283490 00:26:56.821 [2024-07-12 10:53:31.764713] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:56.821 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:56.821 "name": "raid_bdev1", 00:26:56.821 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:56.821 "strip_size_kb": 0, 00:26:56.821 "state": "online", 00:26:56.821 "raid_level": "raid1", 00:26:56.821 "superblock": true, 00:26:56.821 "num_base_bdevs": 2, 00:26:56.821 "num_base_bdevs_discovered": 2, 00:26:56.821 "num_base_bdevs_operational": 2, 00:26:56.821 "base_bdevs_list": [ 00:26:56.821 { 00:26:56.821 "name": "spare", 00:26:56.821 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:56.821 "is_configured": true, 00:26:56.821 "data_offset": 256, 00:26:56.821 "data_size": 7936 00:26:56.821 }, 00:26:56.821 { 00:26:56.821 "name": "BaseBdev2", 00:26:56.821 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:56.821 "is_configured": true, 00:26:56.821 "data_offset": 256, 00:26:56.821 "data_size": 7936 00:26:56.821 } 00:26:56.821 ] 00:26:56.821 }' 00:26:56.821 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:56.821 10:53:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:57.387 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:57.387 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.387 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:57.387 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:57.387 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.387 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.387 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.644 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.644 "name": "raid_bdev1", 00:26:57.644 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:57.644 "strip_size_kb": 0, 00:26:57.644 "state": "online", 00:26:57.644 "raid_level": "raid1", 00:26:57.644 "superblock": true, 00:26:57.645 "num_base_bdevs": 2, 00:26:57.645 "num_base_bdevs_discovered": 2, 00:26:57.645 "num_base_bdevs_operational": 2, 00:26:57.645 "base_bdevs_list": [ 00:26:57.645 { 00:26:57.645 "name": "spare", 00:26:57.645 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:26:57.645 "is_configured": true, 00:26:57.645 "data_offset": 256, 00:26:57.645 "data_size": 7936 00:26:57.645 }, 00:26:57.645 { 00:26:57.645 "name": "BaseBdev2", 00:26:57.645 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:57.645 "is_configured": true, 00:26:57.645 
"data_offset": 256, 00:26:57.645 "data_size": 7936 00:26:57.645 } 00:26:57.645 ] 00:26:57.645 }' 00:26:57.645 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.645 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:57.645 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.902 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:57.902 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.902 10:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:57.902 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:57.903 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:58.160 [2024-07-12 10:53:33.322507] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.160 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.418 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.418 "name": "raid_bdev1", 00:26:58.418 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:26:58.418 "strip_size_kb": 0, 00:26:58.418 "state": "online", 00:26:58.418 "raid_level": "raid1", 00:26:58.418 "superblock": true, 00:26:58.418 "num_base_bdevs": 2, 00:26:58.418 "num_base_bdevs_discovered": 1, 00:26:58.418 "num_base_bdevs_operational": 1, 00:26:58.418 "base_bdevs_list": [ 00:26:58.418 { 00:26:58.418 "name": null, 00:26:58.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.418 "is_configured": false, 00:26:58.418 "data_offset": 256, 00:26:58.418 "data_size": 7936 00:26:58.418 }, 00:26:58.418 { 00:26:58.418 "name": "BaseBdev2", 00:26:58.418 "uuid": 
"8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:26:58.418 "is_configured": true, 00:26:58.418 "data_offset": 256, 00:26:58.418 "data_size": 7936 00:26:58.418 } 00:26:58.418 ] 00:26:58.418 }' 00:26:58.418 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.418 10:53:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:59.352 10:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:59.352 [2024-07-12 10:53:34.417415] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:59.352 [2024-07-12 10:53:34.417573] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:59.352 [2024-07-12 10:53:34.417591] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:59.352 [2024-07-12 10:53:34.417619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:59.352 [2024-07-12 10:53:34.422440] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdacfc0 00:26:59.352 [2024-07-12 10:53:34.424814] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:59.352 10:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:00.286 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:00.286 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:00.286 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:00.286 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:00.286 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:00.286 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.286 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:00.544 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:00.544 "name": "raid_bdev1", 00:27:00.544 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:00.544 "strip_size_kb": 0, 00:27:00.544 "state": "online", 00:27:00.544 "raid_level": "raid1", 00:27:00.544 "superblock": true, 00:27:00.544 "num_base_bdevs": 2, 00:27:00.544 "num_base_bdevs_discovered": 2, 00:27:00.544 "num_base_bdevs_operational": 2, 00:27:00.544 "process": { 00:27:00.544 "type": "rebuild", 00:27:00.544 "target": "spare", 00:27:00.544 "progress": { 00:27:00.544 "blocks": 3072, 00:27:00.544 "percent": 38 00:27:00.544 } 00:27:00.544 }, 00:27:00.544 "base_bdevs_list": [ 00:27:00.544 { 00:27:00.544 "name": "spare", 00:27:00.544 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:27:00.544 "is_configured": true, 00:27:00.544 "data_offset": 256, 00:27:00.544 "data_size": 7936 00:27:00.544 }, 00:27:00.544 { 00:27:00.544 "name": "BaseBdev2", 00:27:00.544 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:00.544 "is_configured": true, 00:27:00.544 "data_offset": 256, 00:27:00.544 "data_size": 
7936 00:27:00.544 } 00:27:00.544 ] 00:27:00.544 }' 00:27:00.544 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:00.802 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:00.802 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:00.802 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:00.802 10:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:00.802 [2024-07-12 10:53:35.990616] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:01.060 [2024-07-12 10:53:36.037549] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:01.060 [2024-07-12 10:53:36.037598] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:01.060 [2024-07-12 10:53:36.037614] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:01.060 [2024-07-12 10:53:36.037622] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.060 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:01.318 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:01.318 "name": "raid_bdev1", 00:27:01.318 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:01.318 "strip_size_kb": 0, 00:27:01.318 "state": "online", 00:27:01.318 "raid_level": "raid1", 00:27:01.318 "superblock": true, 00:27:01.318 "num_base_bdevs": 2, 00:27:01.318 "num_base_bdevs_discovered": 1, 00:27:01.318 "num_base_bdevs_operational": 1, 00:27:01.318 "base_bdevs_list": [ 00:27:01.318 { 00:27:01.318 "name": null, 00:27:01.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.318 "is_configured": false, 00:27:01.318 "data_offset": 256, 00:27:01.318 "data_size": 7936 00:27:01.318 }, 00:27:01.318 { 00:27:01.318 
"name": "BaseBdev2", 00:27:01.318 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:01.318 "is_configured": true, 00:27:01.318 "data_offset": 256, 00:27:01.319 "data_size": 7936 00:27:01.319 } 00:27:01.319 ] 00:27:01.319 }' 00:27:01.319 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:01.319 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:01.884 10:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:02.142 [2024-07-12 10:53:37.120953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:02.142 [2024-07-12 10:53:37.121006] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.142 [2024-07-12 10:53:37.121029] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10c9a90 00:27:02.142 [2024-07-12 10:53:37.121042] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.142 [2024-07-12 10:53:37.121416] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.142 [2024-07-12 10:53:37.121434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:02.142 [2024-07-12 10:53:37.121525] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:02.142 [2024-07-12 10:53:37.121538] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:02.142 [2024-07-12 10:53:37.121548] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:02.142 [2024-07-12 10:53:37.121569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:02.142 [2024-07-12 10:53:37.126708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10d3500 00:27:02.142 spare 00:27:02.142 [2024-07-12 10:53:37.128187] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:02.142 10:53:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:03.077 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:03.077 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.077 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:03.077 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:03.077 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:03.077 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.077 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.335 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:03.335 "name": "raid_bdev1", 00:27:03.335 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:03.335 "strip_size_kb": 0, 00:27:03.335 "state": "online", 00:27:03.335 "raid_level": "raid1", 00:27:03.335 "superblock": true, 00:27:03.335 "num_base_bdevs": 2, 00:27:03.335 "num_base_bdevs_discovered": 2, 00:27:03.335 "num_base_bdevs_operational": 2, 00:27:03.335 "process": { 00:27:03.335 "type": "rebuild", 00:27:03.335 "target": "spare", 00:27:03.335 "progress": { 00:27:03.335 "blocks": 3072, 00:27:03.335 "percent": 38 00:27:03.335 } 00:27:03.335 }, 00:27:03.335 "base_bdevs_list": [ 00:27:03.335 { 00:27:03.335 "name": "spare", 00:27:03.335 "uuid": "b90053f4-15d9-5149-bae6-4d8bdeb2080a", 00:27:03.335 "is_configured": true, 00:27:03.335 "data_offset": 256, 00:27:03.335 "data_size": 7936 00:27:03.335 }, 00:27:03.335 { 00:27:03.335 "name": "BaseBdev2", 00:27:03.335 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:03.335 "is_configured": true, 00:27:03.335 "data_offset": 256, 00:27:03.335 "data_size": 7936 00:27:03.335 } 00:27:03.335 ] 00:27:03.335 }' 00:27:03.335 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:03.335 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:03.335 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:03.335 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:03.335 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:03.594 [2024-07-12 10:53:38.715077] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:03.594 [2024-07-12 10:53:38.740911] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:03.594 [2024-07-12 10:53:38.740957] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:03.594 [2024-07-12 10:53:38.740972] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:03.594 [2024-07-12 10:53:38.740980] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.594 10:53:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.852 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:03.852 "name": "raid_bdev1", 00:27:03.852 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:03.852 "strip_size_kb": 0, 00:27:03.852 "state": "online", 00:27:03.852 "raid_level": "raid1", 00:27:03.852 "superblock": true, 00:27:03.852 "num_base_bdevs": 2, 00:27:03.852 "num_base_bdevs_discovered": 1, 00:27:03.852 "num_base_bdevs_operational": 1, 00:27:03.852 "base_bdevs_list": [ 00:27:03.852 { 00:27:03.852 "name": null, 00:27:03.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.852 "is_configured": false, 00:27:03.852 "data_offset": 256, 00:27:03.852 "data_size": 7936 00:27:03.852 }, 00:27:03.852 { 00:27:03.852 "name": "BaseBdev2", 00:27:03.852 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:03.852 "is_configured": true, 00:27:03.852 "data_offset": 256, 00:27:03.852 "data_size": 7936 00:27:03.852 } 00:27:03.852 ] 00:27:03.852 }' 00:27:03.852 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:03.852 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:04.417 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:04.417 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:04.417 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:04.417 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:04.417 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:04.417 10:53:39 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.417 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.675 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:04.675 "name": "raid_bdev1", 00:27:04.675 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:04.675 "strip_size_kb": 0, 00:27:04.675 "state": "online", 00:27:04.675 "raid_level": "raid1", 00:27:04.675 "superblock": true, 00:27:04.675 "num_base_bdevs": 2, 00:27:04.675 "num_base_bdevs_discovered": 1, 00:27:04.675 "num_base_bdevs_operational": 1, 00:27:04.675 "base_bdevs_list": [ 00:27:04.675 { 00:27:04.675 "name": null, 00:27:04.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:04.675 "is_configured": false, 00:27:04.675 "data_offset": 256, 00:27:04.675 "data_size": 7936 00:27:04.675 }, 00:27:04.675 { 00:27:04.675 "name": "BaseBdev2", 00:27:04.675 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:04.675 "is_configured": true, 00:27:04.675 "data_offset": 256, 00:27:04.675 "data_size": 7936 00:27:04.675 } 00:27:04.675 ] 00:27:04.675 }' 00:27:04.675 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:04.933 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:04.933 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:04.933 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:04.933 10:53:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:05.192 10:53:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:05.450 [2024-07-12 10:53:40.409751] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:05.450 [2024-07-12 10:53:40.409801] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:05.450 [2024-07-12 10:53:40.409824] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1282730 00:27:05.450 [2024-07-12 10:53:40.409837] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:05.450 [2024-07-12 10:53:40.410174] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:05.450 [2024-07-12 10:53:40.410191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:05.450 [2024-07-12 10:53:40.410255] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:05.450 [2024-07-12 10:53:40.410266] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:05.450 [2024-07-12 10:53:40.410276] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:05.450 BaseBdev1 00:27:05.450 10:53:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.384 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.643 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:06.643 "name": "raid_bdev1", 00:27:06.643 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:06.643 "strip_size_kb": 0, 00:27:06.643 "state": "online", 00:27:06.643 "raid_level": "raid1", 00:27:06.643 "superblock": true, 00:27:06.643 "num_base_bdevs": 2, 00:27:06.643 "num_base_bdevs_discovered": 1, 00:27:06.643 "num_base_bdevs_operational": 1, 00:27:06.643 "base_bdevs_list": [ 00:27:06.643 { 00:27:06.643 "name": null, 00:27:06.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:06.643 "is_configured": false, 00:27:06.643 "data_offset": 256, 00:27:06.643 "data_size": 7936 00:27:06.643 }, 00:27:06.643 { 00:27:06.643 "name": "BaseBdev2", 00:27:06.643 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:06.643 "is_configured": true, 00:27:06.643 "data_offset": 256, 00:27:06.643 "data_size": 7936 00:27:06.643 } 00:27:06.643 ] 00:27:06.643 }' 00:27:06.643 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:06.643 10:53:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:07.209 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:07.209 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:07.209 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:07.209 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:07.209 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:07.209 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.209 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:27:07.468 "name": "raid_bdev1", 00:27:07.468 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:07.468 "strip_size_kb": 0, 00:27:07.468 "state": "online", 00:27:07.468 "raid_level": "raid1", 00:27:07.468 "superblock": true, 00:27:07.468 "num_base_bdevs": 2, 00:27:07.468 "num_base_bdevs_discovered": 1, 00:27:07.468 "num_base_bdevs_operational": 1, 00:27:07.468 "base_bdevs_list": [ 00:27:07.468 { 00:27:07.468 "name": null, 00:27:07.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.468 "is_configured": false, 00:27:07.468 "data_offset": 256, 00:27:07.468 "data_size": 7936 00:27:07.468 }, 00:27:07.468 { 00:27:07.468 "name": "BaseBdev2", 00:27:07.468 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:07.468 "is_configured": true, 00:27:07.468 "data_offset": 256, 00:27:07.468 "data_size": 7936 00:27:07.468 } 00:27:07.468 ] 00:27:07.468 }' 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:07.468 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:07.727 [2024-07-12 10:53:42.840216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:07.727 [2024-07-12 10:53:42.840337] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: 
raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:07.727 [2024-07-12 10:53:42.840353] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:07.727 request: 00:27:07.727 { 00:27:07.727 "base_bdev": "BaseBdev1", 00:27:07.727 "raid_bdev": "raid_bdev1", 00:27:07.727 "method": "bdev_raid_add_base_bdev", 00:27:07.727 "req_id": 1 00:27:07.727 } 00:27:07.727 Got JSON-RPC error response 00:27:07.727 response: 00:27:07.727 { 00:27:07.727 "code": -22, 00:27:07.727 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:07.727 } 00:27:07.727 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:27:07.727 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:07.727 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:07.727 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:07.727 10:53:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.707 10:53:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.966 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.966 "name": "raid_bdev1", 00:27:08.966 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:08.966 "strip_size_kb": 0, 00:27:08.966 "state": "online", 00:27:08.966 "raid_level": "raid1", 00:27:08.966 "superblock": true, 00:27:08.966 "num_base_bdevs": 2, 00:27:08.966 "num_base_bdevs_discovered": 1, 00:27:08.966 "num_base_bdevs_operational": 1, 00:27:08.966 "base_bdevs_list": [ 00:27:08.966 { 00:27:08.966 "name": null, 00:27:08.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.966 "is_configured": false, 00:27:08.966 "data_offset": 256, 00:27:08.966 "data_size": 7936 00:27:08.966 }, 00:27:08.966 { 00:27:08.966 "name": "BaseBdev2", 00:27:08.966 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:08.966 "is_configured": true, 00:27:08.966 "data_offset": 256, 00:27:08.966 "data_size": 7936 
00:27:08.966 } 00:27:08.966 ] 00:27:08.966 }' 00:27:08.966 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.966 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:09.533 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:09.533 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:09.533 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:09.533 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:09.533 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:09.533 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.533 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.792 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:09.792 "name": "raid_bdev1", 00:27:09.792 "uuid": "98c0a2d8-18c9-48e9-8657-778868f10374", 00:27:09.792 "strip_size_kb": 0, 00:27:09.792 "state": "online", 00:27:09.792 "raid_level": "raid1", 00:27:09.792 "superblock": true, 00:27:09.792 "num_base_bdevs": 2, 00:27:09.792 "num_base_bdevs_discovered": 1, 00:27:09.792 "num_base_bdevs_operational": 1, 00:27:09.792 "base_bdevs_list": [ 00:27:09.792 { 00:27:09.792 "name": null, 00:27:09.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:09.792 "is_configured": false, 00:27:09.792 "data_offset": 256, 00:27:09.792 "data_size": 7936 00:27:09.792 }, 00:27:09.792 { 00:27:09.792 "name": "BaseBdev2", 00:27:09.792 "uuid": "8fe796cf-a9d5-5f88-95d2-47e5d648bf60", 00:27:09.792 "is_configured": true, 00:27:09.792 "data_offset": 256, 00:27:09.792 "data_size": 7936 00:27:09.792 } 00:27:09.792 ] 00:27:09.792 }' 00:27:09.792 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:09.792 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:09.792 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:09.792 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:09.792 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2157214 00:27:09.792 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2157214 ']' 00:27:10.051 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2157214 00:27:10.051 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:10.051 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:10.051 10:53:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2157214 00:27:10.051 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:10.051 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:10.051 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2157214' 00:27:10.051 killing process with pid 2157214 00:27:10.051 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2157214 00:27:10.051 Received shutdown signal, test time was about 60.000000 seconds 00:27:10.051 00:27:10.051 Latency(us) 00:27:10.051 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:10.051 =================================================================================================================== 00:27:10.051 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:10.051 [2024-07-12 10:53:45.036431] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:10.051 [2024-07-12 10:53:45.036528] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:10.051 [2024-07-12 10:53:45.036573] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:10.051 [2024-07-12 10:53:45.036585] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1283490 name raid_bdev1, state offline 00:27:10.051 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2157214 00:27:10.051 [2024-07-12 10:53:45.063159] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:10.311 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:27:10.311 00:27:10.311 real 0m30.258s 00:27:10.311 user 0m47.277s 00:27:10.311 sys 0m5.030s 00:27:10.311 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:10.311 10:53:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:10.311 ************************************ 00:27:10.311 END TEST raid_rebuild_test_sb_4k 00:27:10.311 ************************************ 00:27:10.311 10:53:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:10.311 10:53:45 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:27:10.311 10:53:45 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:27:10.311 10:53:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:10.311 10:53:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:10.311 10:53:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:10.311 ************************************ 00:27:10.311 START TEST raid_state_function_test_sb_md_separate 00:27:10.311 ************************************ 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:10.311 10:53:45 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2161540 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2161540' 00:27:10.311 Process raid pid: 2161540 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2161540 /var/tmp/spdk-raid.sock 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2161540 ']' 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:10.311 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
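Once that socket appears, everything in this test is driven over it with rpc.py: raid bdevs are created with bdev_raid_create, and verify_raid_bdev_state checks them by piping bdev_raid_get_bdevs output through jq, as the trace below shows. A minimal sketch of that query-and-check pattern, reduced to the state field only; the helper name check_raid_state is hypothetical rather than part of the test suite, and it assumes an SPDK app is already listening on /var/tmp/spdk-raid.sock.

# Hypothetical, reduced helper modelled on the verify_raid_bdev_state calls in the trace.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

check_raid_state() {
    local name=$1 expected=$2
    local state
    # bdev_raid_get_bdevs returns a JSON array; keep only the entry for this raid bdev
    state=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\") | .state")
    [ "$state" = "$expected" ]
}

# e.g. after 'bdev_raid_create -s -r raid1 -b "BaseBdev1 BaseBdev2" -n Existed_Raid':
check_raid_state Existed_Raid configuring || echo "Existed_Raid not in expected state"

The full helper also records raid_level, strip size and the base-bdev counts from the same JSON (the locals set at bdev_raid.sh@116-124), though those comparisons are hidden behind xtrace_disable in this trace.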
00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:10.311 10:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:10.311 [2024-07-12 10:53:45.406532] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:27:10.311 [2024-07-12 10:53:45.406601] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:10.571 [2024-07-12 10:53:45.534849] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:10.571 [2024-07-12 10:53:45.631086] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:10.571 [2024-07-12 10:53:45.694383] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:10.571 [2024-07-12 10:53:45.694418] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:11.138 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:11.138 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:11.138 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:11.397 [2024-07-12 10:53:46.479168] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:11.397 [2024-07-12 10:53:46.479212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:11.397 [2024-07-12 10:53:46.479223] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:11.397 [2024-07-12 10:53:46.479234] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.397 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:11.656 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:11.656 "name": "Existed_Raid", 00:27:11.656 "uuid": "0fdc441a-2249-4b81-b622-9982727ce219", 00:27:11.656 "strip_size_kb": 0, 00:27:11.656 "state": "configuring", 00:27:11.656 "raid_level": "raid1", 00:27:11.656 "superblock": true, 00:27:11.656 "num_base_bdevs": 2, 00:27:11.656 "num_base_bdevs_discovered": 0, 00:27:11.656 "num_base_bdevs_operational": 2, 00:27:11.656 "base_bdevs_list": [ 00:27:11.656 { 00:27:11.656 "name": "BaseBdev1", 00:27:11.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.656 "is_configured": false, 00:27:11.656 "data_offset": 0, 00:27:11.656 "data_size": 0 00:27:11.656 }, 00:27:11.656 { 00:27:11.656 "name": "BaseBdev2", 00:27:11.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.656 "is_configured": false, 00:27:11.656 "data_offset": 0, 00:27:11.656 "data_size": 0 00:27:11.656 } 00:27:11.656 ] 00:27:11.656 }' 00:27:11.656 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:11.656 10:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:12.223 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:12.482 [2024-07-12 10:53:47.473667] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:12.482 [2024-07-12 10:53:47.473698] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1568a80 name Existed_Raid, state configuring 00:27:12.482 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:12.482 [2024-07-12 10:53:47.650147] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:12.482 [2024-07-12 10:53:47.650176] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:12.482 [2024-07-12 10:53:47.650185] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:12.482 [2024-07-12 10:53:47.650197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:12.482 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:27:12.741 [2024-07-12 10:53:47.832962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:12.741 BaseBdev1 00:27:12.741 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:12.741 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:12.741 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:12.741 
10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:12.741 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:12.741 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:12.741 10:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:13.000 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:13.000 [ 00:27:13.000 { 00:27:13.000 "name": "BaseBdev1", 00:27:13.000 "aliases": [ 00:27:13.000 "8ffd19fd-ff32-47db-913d-8605e0181fe9" 00:27:13.000 ], 00:27:13.000 "product_name": "Malloc disk", 00:27:13.000 "block_size": 4096, 00:27:13.000 "num_blocks": 8192, 00:27:13.000 "uuid": "8ffd19fd-ff32-47db-913d-8605e0181fe9", 00:27:13.000 "md_size": 32, 00:27:13.000 "md_interleave": false, 00:27:13.000 "dif_type": 0, 00:27:13.000 "assigned_rate_limits": { 00:27:13.000 "rw_ios_per_sec": 0, 00:27:13.000 "rw_mbytes_per_sec": 0, 00:27:13.000 "r_mbytes_per_sec": 0, 00:27:13.000 "w_mbytes_per_sec": 0 00:27:13.000 }, 00:27:13.000 "claimed": true, 00:27:13.000 "claim_type": "exclusive_write", 00:27:13.000 "zoned": false, 00:27:13.000 "supported_io_types": { 00:27:13.000 "read": true, 00:27:13.000 "write": true, 00:27:13.000 "unmap": true, 00:27:13.000 "flush": true, 00:27:13.000 "reset": true, 00:27:13.000 "nvme_admin": false, 00:27:13.000 "nvme_io": false, 00:27:13.000 "nvme_io_md": false, 00:27:13.000 "write_zeroes": true, 00:27:13.000 "zcopy": true, 00:27:13.000 "get_zone_info": false, 00:27:13.000 "zone_management": false, 00:27:13.000 "zone_append": false, 00:27:13.000 "compare": false, 00:27:13.000 "compare_and_write": false, 00:27:13.000 "abort": true, 00:27:13.000 "seek_hole": false, 00:27:13.000 "seek_data": false, 00:27:13.000 "copy": true, 00:27:13.000 "nvme_iov_md": false 00:27:13.000 }, 00:27:13.000 "memory_domains": [ 00:27:13.000 { 00:27:13.000 "dma_device_id": "system", 00:27:13.000 "dma_device_type": 1 00:27:13.000 }, 00:27:13.000 { 00:27:13.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.000 "dma_device_type": 2 00:27:13.000 } 00:27:13.000 ], 00:27:13.000 "driver_specific": {} 00:27:13.000 } 00:27:13.000 ] 00:27:13.259 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:13.259 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:13.259 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:13.259 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:13.259 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.259 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.259 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:13.259 10:53:48 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.260 "name": "Existed_Raid", 00:27:13.260 "uuid": "f4b93896-40fd-471f-af81-8e2afaea0905", 00:27:13.260 "strip_size_kb": 0, 00:27:13.260 "state": "configuring", 00:27:13.260 "raid_level": "raid1", 00:27:13.260 "superblock": true, 00:27:13.260 "num_base_bdevs": 2, 00:27:13.260 "num_base_bdevs_discovered": 1, 00:27:13.260 "num_base_bdevs_operational": 2, 00:27:13.260 "base_bdevs_list": [ 00:27:13.260 { 00:27:13.260 "name": "BaseBdev1", 00:27:13.260 "uuid": "8ffd19fd-ff32-47db-913d-8605e0181fe9", 00:27:13.260 "is_configured": true, 00:27:13.260 "data_offset": 256, 00:27:13.260 "data_size": 7936 00:27:13.260 }, 00:27:13.260 { 00:27:13.260 "name": "BaseBdev2", 00:27:13.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.260 "is_configured": false, 00:27:13.260 "data_offset": 0, 00:27:13.260 "data_size": 0 00:27:13.260 } 00:27:13.260 ] 00:27:13.260 }' 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.260 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:13.828 10:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:14.086 [2024-07-12 10:53:49.196606] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:14.086 [2024-07-12 10:53:49.196646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1568350 name Existed_Raid, state configuring 00:27:14.086 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:14.345 [2024-07-12 10:53:49.441296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:14.345 [2024-07-12 10:53:49.442775] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:14.345 [2024-07-12 10:53:49.442808] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.345 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:14.603 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:14.603 "name": "Existed_Raid", 00:27:14.603 "uuid": "e93fe915-adc2-4e56-b922-70d0f81bec8d", 00:27:14.603 "strip_size_kb": 0, 00:27:14.603 "state": "configuring", 00:27:14.603 "raid_level": "raid1", 00:27:14.603 "superblock": true, 00:27:14.604 "num_base_bdevs": 2, 00:27:14.604 "num_base_bdevs_discovered": 1, 00:27:14.604 "num_base_bdevs_operational": 2, 00:27:14.604 "base_bdevs_list": [ 00:27:14.604 { 00:27:14.604 "name": "BaseBdev1", 00:27:14.604 "uuid": "8ffd19fd-ff32-47db-913d-8605e0181fe9", 00:27:14.604 "is_configured": true, 00:27:14.604 "data_offset": 256, 00:27:14.604 "data_size": 7936 00:27:14.604 }, 00:27:14.604 { 00:27:14.604 "name": "BaseBdev2", 00:27:14.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.604 "is_configured": false, 00:27:14.604 "data_offset": 0, 00:27:14.604 "data_size": 0 00:27:14.604 } 00:27:14.604 ] 00:27:14.604 }' 00:27:14.604 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:14.604 10:53:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:15.170 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:27:15.429 [2024-07-12 10:53:50.383887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:15.429 [2024-07-12 10:53:50.384037] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x156a210 00:27:15.429 [2024-07-12 10:53:50.384051] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:15.429 [2024-07-12 10:53:50.384112] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1569c50 00:27:15.429 [2024-07-12 
10:53:50.384207] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x156a210 00:27:15.429 [2024-07-12 10:53:50.384217] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x156a210 00:27:15.429 [2024-07-12 10:53:50.384283] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.429 BaseBdev2 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:15.429 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:15.687 [ 00:27:15.687 { 00:27:15.687 "name": "BaseBdev2", 00:27:15.687 "aliases": [ 00:27:15.687 "d87cf05a-a800-485a-a724-ce3b968e48e5" 00:27:15.687 ], 00:27:15.687 "product_name": "Malloc disk", 00:27:15.687 "block_size": 4096, 00:27:15.687 "num_blocks": 8192, 00:27:15.687 "uuid": "d87cf05a-a800-485a-a724-ce3b968e48e5", 00:27:15.687 "md_size": 32, 00:27:15.687 "md_interleave": false, 00:27:15.687 "dif_type": 0, 00:27:15.687 "assigned_rate_limits": { 00:27:15.687 "rw_ios_per_sec": 0, 00:27:15.687 "rw_mbytes_per_sec": 0, 00:27:15.687 "r_mbytes_per_sec": 0, 00:27:15.687 "w_mbytes_per_sec": 0 00:27:15.687 }, 00:27:15.687 "claimed": true, 00:27:15.687 "claim_type": "exclusive_write", 00:27:15.687 "zoned": false, 00:27:15.687 "supported_io_types": { 00:27:15.687 "read": true, 00:27:15.687 "write": true, 00:27:15.687 "unmap": true, 00:27:15.687 "flush": true, 00:27:15.687 "reset": true, 00:27:15.687 "nvme_admin": false, 00:27:15.687 "nvme_io": false, 00:27:15.687 "nvme_io_md": false, 00:27:15.687 "write_zeroes": true, 00:27:15.687 "zcopy": true, 00:27:15.687 "get_zone_info": false, 00:27:15.687 "zone_management": false, 00:27:15.687 "zone_append": false, 00:27:15.687 "compare": false, 00:27:15.687 "compare_and_write": false, 00:27:15.687 "abort": true, 00:27:15.687 "seek_hole": false, 00:27:15.687 "seek_data": false, 00:27:15.687 "copy": true, 00:27:15.687 "nvme_iov_md": false 00:27:15.687 }, 00:27:15.687 "memory_domains": [ 00:27:15.687 { 00:27:15.687 "dma_device_id": "system", 00:27:15.687 "dma_device_type": 1 00:27:15.687 }, 00:27:15.687 { 00:27:15.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:15.687 "dma_device_type": 2 00:27:15.687 } 00:27:15.687 ], 00:27:15.687 "driver_specific": {} 00:27:15.687 } 00:27:15.688 ] 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.688 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:15.946 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.946 "name": "Existed_Raid", 00:27:15.946 "uuid": "e93fe915-adc2-4e56-b922-70d0f81bec8d", 00:27:15.946 "strip_size_kb": 0, 00:27:15.946 "state": "online", 00:27:15.946 "raid_level": "raid1", 00:27:15.946 "superblock": true, 00:27:15.946 "num_base_bdevs": 2, 00:27:15.946 "num_base_bdevs_discovered": 2, 00:27:15.946 "num_base_bdevs_operational": 2, 00:27:15.946 "base_bdevs_list": [ 00:27:15.946 { 00:27:15.946 "name": "BaseBdev1", 00:27:15.946 "uuid": "8ffd19fd-ff32-47db-913d-8605e0181fe9", 00:27:15.946 "is_configured": true, 00:27:15.946 "data_offset": 256, 00:27:15.946 "data_size": 7936 00:27:15.946 }, 00:27:15.946 { 00:27:15.946 "name": "BaseBdev2", 00:27:15.946 "uuid": "d87cf05a-a800-485a-a724-ce3b968e48e5", 00:27:15.946 "is_configured": true, 00:27:15.946 "data_offset": 256, 00:27:15.946 "data_size": 7936 00:27:15.946 } 00:27:15.946 ] 00:27:15.946 }' 00:27:15.946 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.946 10:53:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # 
local base_bdev_info 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:16.513 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:16.772 [2024-07-12 10:53:51.719873] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:16.772 "name": "Existed_Raid", 00:27:16.772 "aliases": [ 00:27:16.772 "e93fe915-adc2-4e56-b922-70d0f81bec8d" 00:27:16.772 ], 00:27:16.772 "product_name": "Raid Volume", 00:27:16.772 "block_size": 4096, 00:27:16.772 "num_blocks": 7936, 00:27:16.772 "uuid": "e93fe915-adc2-4e56-b922-70d0f81bec8d", 00:27:16.772 "md_size": 32, 00:27:16.772 "md_interleave": false, 00:27:16.772 "dif_type": 0, 00:27:16.772 "assigned_rate_limits": { 00:27:16.772 "rw_ios_per_sec": 0, 00:27:16.772 "rw_mbytes_per_sec": 0, 00:27:16.772 "r_mbytes_per_sec": 0, 00:27:16.772 "w_mbytes_per_sec": 0 00:27:16.772 }, 00:27:16.772 "claimed": false, 00:27:16.772 "zoned": false, 00:27:16.772 "supported_io_types": { 00:27:16.772 "read": true, 00:27:16.772 "write": true, 00:27:16.772 "unmap": false, 00:27:16.772 "flush": false, 00:27:16.772 "reset": true, 00:27:16.772 "nvme_admin": false, 00:27:16.772 "nvme_io": false, 00:27:16.772 "nvme_io_md": false, 00:27:16.772 "write_zeroes": true, 00:27:16.772 "zcopy": false, 00:27:16.772 "get_zone_info": false, 00:27:16.772 "zone_management": false, 00:27:16.772 "zone_append": false, 00:27:16.772 "compare": false, 00:27:16.772 "compare_and_write": false, 00:27:16.772 "abort": false, 00:27:16.772 "seek_hole": false, 00:27:16.772 "seek_data": false, 00:27:16.772 "copy": false, 00:27:16.772 "nvme_iov_md": false 00:27:16.772 }, 00:27:16.772 "memory_domains": [ 00:27:16.772 { 00:27:16.772 "dma_device_id": "system", 00:27:16.772 "dma_device_type": 1 00:27:16.772 }, 00:27:16.772 { 00:27:16.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:16.772 "dma_device_type": 2 00:27:16.772 }, 00:27:16.772 { 00:27:16.772 "dma_device_id": "system", 00:27:16.772 "dma_device_type": 1 00:27:16.772 }, 00:27:16.772 { 00:27:16.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:16.772 "dma_device_type": 2 00:27:16.772 } 00:27:16.772 ], 00:27:16.772 "driver_specific": { 00:27:16.772 "raid": { 00:27:16.772 "uuid": "e93fe915-adc2-4e56-b922-70d0f81bec8d", 00:27:16.772 "strip_size_kb": 0, 00:27:16.772 "state": "online", 00:27:16.772 "raid_level": "raid1", 00:27:16.772 "superblock": true, 00:27:16.772 "num_base_bdevs": 2, 00:27:16.772 "num_base_bdevs_discovered": 2, 00:27:16.772 "num_base_bdevs_operational": 2, 00:27:16.772 "base_bdevs_list": [ 00:27:16.772 { 00:27:16.772 "name": "BaseBdev1", 00:27:16.772 "uuid": "8ffd19fd-ff32-47db-913d-8605e0181fe9", 00:27:16.772 "is_configured": true, 00:27:16.772 "data_offset": 256, 00:27:16.772 "data_size": 7936 00:27:16.772 }, 00:27:16.772 { 00:27:16.772 "name": "BaseBdev2", 00:27:16.772 "uuid": "d87cf05a-a800-485a-a724-ce3b968e48e5", 00:27:16.772 "is_configured": true, 00:27:16.772 "data_offset": 256, 00:27:16.772 "data_size": 7936 00:27:16.772 } 00:27:16.772 
] 00:27:16.772 } 00:27:16.772 } 00:27:16.772 }' 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:16.772 BaseBdev2' 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:16.772 "name": "BaseBdev1", 00:27:16.772 "aliases": [ 00:27:16.772 "8ffd19fd-ff32-47db-913d-8605e0181fe9" 00:27:16.772 ], 00:27:16.772 "product_name": "Malloc disk", 00:27:16.772 "block_size": 4096, 00:27:16.772 "num_blocks": 8192, 00:27:16.772 "uuid": "8ffd19fd-ff32-47db-913d-8605e0181fe9", 00:27:16.772 "md_size": 32, 00:27:16.772 "md_interleave": false, 00:27:16.772 "dif_type": 0, 00:27:16.772 "assigned_rate_limits": { 00:27:16.772 "rw_ios_per_sec": 0, 00:27:16.772 "rw_mbytes_per_sec": 0, 00:27:16.772 "r_mbytes_per_sec": 0, 00:27:16.772 "w_mbytes_per_sec": 0 00:27:16.772 }, 00:27:16.772 "claimed": true, 00:27:16.772 "claim_type": "exclusive_write", 00:27:16.772 "zoned": false, 00:27:16.772 "supported_io_types": { 00:27:16.772 "read": true, 00:27:16.772 "write": true, 00:27:16.772 "unmap": true, 00:27:16.772 "flush": true, 00:27:16.772 "reset": true, 00:27:16.772 "nvme_admin": false, 00:27:16.772 "nvme_io": false, 00:27:16.772 "nvme_io_md": false, 00:27:16.772 "write_zeroes": true, 00:27:16.772 "zcopy": true, 00:27:16.772 "get_zone_info": false, 00:27:16.772 "zone_management": false, 00:27:16.772 "zone_append": false, 00:27:16.772 "compare": false, 00:27:16.772 "compare_and_write": false, 00:27:16.772 "abort": true, 00:27:16.772 "seek_hole": false, 00:27:16.772 "seek_data": false, 00:27:16.772 "copy": true, 00:27:16.772 "nvme_iov_md": false 00:27:16.772 }, 00:27:16.772 "memory_domains": [ 00:27:16.772 { 00:27:16.772 "dma_device_id": "system", 00:27:16.772 "dma_device_type": 1 00:27:16.772 }, 00:27:16.772 { 00:27:16.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:16.772 "dma_device_type": 2 00:27:16.772 } 00:27:16.772 ], 00:27:16.772 "driver_specific": {} 00:27:16.772 }' 00:27:16.772 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.031 10:53:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.031 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:17.031 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.031 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.031 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:17.031 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.031 10:53:52 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.031 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:17.031 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:17.290 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:17.290 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:17.290 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:17.290 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:17.290 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:17.855 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:17.855 "name": "BaseBdev2", 00:27:17.855 "aliases": [ 00:27:17.855 "d87cf05a-a800-485a-a724-ce3b968e48e5" 00:27:17.855 ], 00:27:17.855 "product_name": "Malloc disk", 00:27:17.855 "block_size": 4096, 00:27:17.855 "num_blocks": 8192, 00:27:17.855 "uuid": "d87cf05a-a800-485a-a724-ce3b968e48e5", 00:27:17.855 "md_size": 32, 00:27:17.855 "md_interleave": false, 00:27:17.855 "dif_type": 0, 00:27:17.856 "assigned_rate_limits": { 00:27:17.856 "rw_ios_per_sec": 0, 00:27:17.856 "rw_mbytes_per_sec": 0, 00:27:17.856 "r_mbytes_per_sec": 0, 00:27:17.856 "w_mbytes_per_sec": 0 00:27:17.856 }, 00:27:17.856 "claimed": true, 00:27:17.856 "claim_type": "exclusive_write", 00:27:17.856 "zoned": false, 00:27:17.856 "supported_io_types": { 00:27:17.856 "read": true, 00:27:17.856 "write": true, 00:27:17.856 "unmap": true, 00:27:17.856 "flush": true, 00:27:17.856 "reset": true, 00:27:17.856 "nvme_admin": false, 00:27:17.856 "nvme_io": false, 00:27:17.856 "nvme_io_md": false, 00:27:17.856 "write_zeroes": true, 00:27:17.856 "zcopy": true, 00:27:17.856 "get_zone_info": false, 00:27:17.856 "zone_management": false, 00:27:17.856 "zone_append": false, 00:27:17.856 "compare": false, 00:27:17.856 "compare_and_write": false, 00:27:17.856 "abort": true, 00:27:17.856 "seek_hole": false, 00:27:17.856 "seek_data": false, 00:27:17.856 "copy": true, 00:27:17.856 "nvme_iov_md": false 00:27:17.856 }, 00:27:17.856 "memory_domains": [ 00:27:17.856 { 00:27:17.856 "dma_device_id": "system", 00:27:17.856 "dma_device_type": 1 00:27:17.856 }, 00:27:17.856 { 00:27:17.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:17.856 "dma_device_type": 2 00:27:17.856 } 00:27:17.856 ], 00:27:17.856 "driver_specific": {} 00:27:17.856 }' 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 
]] 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:17.856 10:53:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:17.856 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:17.856 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:17.856 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:18.114 [2024-07-12 10:53:53.187518] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.114 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:18.372 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:18.372 "name": "Existed_Raid", 00:27:18.372 "uuid": "e93fe915-adc2-4e56-b922-70d0f81bec8d", 00:27:18.372 "strip_size_kb": 0, 
00:27:18.372 "state": "online", 00:27:18.372 "raid_level": "raid1", 00:27:18.372 "superblock": true, 00:27:18.372 "num_base_bdevs": 2, 00:27:18.372 "num_base_bdevs_discovered": 1, 00:27:18.372 "num_base_bdevs_operational": 1, 00:27:18.372 "base_bdevs_list": [ 00:27:18.372 { 00:27:18.372 "name": null, 00:27:18.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.372 "is_configured": false, 00:27:18.372 "data_offset": 256, 00:27:18.372 "data_size": 7936 00:27:18.372 }, 00:27:18.372 { 00:27:18.372 "name": "BaseBdev2", 00:27:18.372 "uuid": "d87cf05a-a800-485a-a724-ce3b968e48e5", 00:27:18.372 "is_configured": true, 00:27:18.372 "data_offset": 256, 00:27:18.372 "data_size": 7936 00:27:18.372 } 00:27:18.372 ] 00:27:18.372 }' 00:27:18.372 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.372 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:18.939 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:18.939 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:18.939 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.939 10:53:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:19.198 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:19.198 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:19.198 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:19.456 [2024-07-12 10:53:54.467404] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:19.456 [2024-07-12 10:53:54.467499] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:19.456 [2024-07-12 10:53:54.479713] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:19.456 [2024-07-12 10:53:54.479747] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:19.456 [2024-07-12 10:53:54.479758] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x156a210 name Existed_Raid, state offline 00:27:19.456 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:19.456 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:19.456 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.456 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:19.715 10:53:54 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2161540 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2161540 ']' 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2161540 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2161540 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2161540' 00:27:19.715 killing process with pid 2161540 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2161540 00:27:19.715 [2024-07-12 10:53:54.729854] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:19.715 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2161540 00:27:19.715 [2024-07-12 10:53:54.730837] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:19.974 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:27:19.974 00:27:19.974 real 0m9.612s 00:27:19.974 user 0m17.019s 00:27:19.974 sys 0m1.857s 00:27:19.974 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:19.974 10:53:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:19.974 ************************************ 00:27:19.974 END TEST raid_state_function_test_sb_md_separate 00:27:19.974 ************************************ 00:27:19.974 10:53:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:19.974 10:53:54 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:27:19.974 10:53:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:19.974 10:53:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:19.974 10:53:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:19.974 ************************************ 00:27:19.974 START TEST raid_superblock_test_md_separate 00:27:19.974 ************************************ 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:19.974 10:53:55 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2162988 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2162988 /var/tmp/spdk-raid.sock 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2162988 ']' 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:19.974 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:19.974 10:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:19.974 [2024-07-12 10:53:55.095211] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
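The RPC calls traced after this point assemble the fixture for the superblock test: two 4096-byte-block malloc bdevs with 32 bytes of separate metadata, a passthru bdev (pt1, pt2) pinned to a fixed UUID on top of each, and finally a raid1 volume created with the -s (superblock) flag from the two passthru devices. Condensed into a plain shell sketch using the same RPC commands as the trace; the $rpc shorthand is illustrative, and it assumes the bdev_svc app launched above is listening on /var/tmp/spdk-raid.sock.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# two malloc bdevs: 32 MB, 4096-byte blocks, 32-byte metadata kept in a separate buffer (-m 32)
$rpc bdev_malloc_create 32 4096 -m 32 -b malloc1
$rpc bdev_malloc_create 32 4096 -m 32 -b malloc2

# passthru bdevs pin a known UUID on each base device; these UUIDs reappear in base_bdevs_list
$rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# raid1 volume with an on-disk superblock (-s) across the two passthru bdevs
$rpc bdev_raid_create -s -r raid1 -b 'pt1 pt2' -n raid_bdev1

# inspect the result (state online, data_offset 256, data_size 7936 blocks per base bdev)
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

With -m 32 and no interleave option, the resulting bdevs report md_size 32 and md_interleave false in bdev_get_bdevs, which is exactly the layout the md_separate variants of these tests exercise.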
00:27:19.974 [2024-07-12 10:53:55.095274] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2162988 ] 00:27:20.232 [2024-07-12 10:53:55.223217] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:20.232 [2024-07-12 10:53:55.329461] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:20.232 [2024-07-12 10:53:55.399779] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:20.232 [2024-07-12 10:53:55.399817] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:27:21.169 malloc1 00:27:21.169 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:21.428 [2024-07-12 10:53:56.483121] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:21.428 [2024-07-12 10:53:56.483169] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.428 [2024-07-12 10:53:56.483191] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22e8830 00:27:21.428 [2024-07-12 10:53:56.483204] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.428 [2024-07-12 10:53:56.484744] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.428 [2024-07-12 10:53:56.484772] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:21.428 pt1 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc2 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:21.428 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:27:21.686 malloc2 00:27:21.686 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:21.946 [2024-07-12 10:53:56.898614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:21.946 [2024-07-12 10:53:56.898660] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.946 [2024-07-12 10:53:56.898682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22da250 00:27:21.946 [2024-07-12 10:53:56.898695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.946 [2024-07-12 10:53:56.900151] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.946 [2024-07-12 10:53:56.900178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:21.946 pt2 00:27:21.946 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:21.946 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:21.946 10:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:21.946 [2024-07-12 10:53:57.067077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:21.946 [2024-07-12 10:53:57.068477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:21.946 [2024-07-12 10:53:57.068634] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22dad20 00:27:21.946 [2024-07-12 10:53:57.068648] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:21.946 [2024-07-12 10:53:57.068724] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22cea60 00:27:21.946 [2024-07-12 10:53:57.068842] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22dad20 00:27:21.946 [2024-07-12 10:53:57.068853] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22dad20 00:27:21.946 [2024-07-12 10:53:57.068931] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:21.946 10:53:57 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.946 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.204 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:22.204 "name": "raid_bdev1", 00:27:22.204 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:22.204 "strip_size_kb": 0, 00:27:22.204 "state": "online", 00:27:22.204 "raid_level": "raid1", 00:27:22.204 "superblock": true, 00:27:22.204 "num_base_bdevs": 2, 00:27:22.204 "num_base_bdevs_discovered": 2, 00:27:22.204 "num_base_bdevs_operational": 2, 00:27:22.204 "base_bdevs_list": [ 00:27:22.204 { 00:27:22.204 "name": "pt1", 00:27:22.204 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:22.204 "is_configured": true, 00:27:22.204 "data_offset": 256, 00:27:22.204 "data_size": 7936 00:27:22.204 }, 00:27:22.204 { 00:27:22.204 "name": "pt2", 00:27:22.204 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:22.204 "is_configured": true, 00:27:22.204 "data_offset": 256, 00:27:22.204 "data_size": 7936 00:27:22.204 } 00:27:22.204 ] 00:27:22.204 }' 00:27:22.204 10:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:22.204 10:53:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:23.137 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:23.137 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:23.137 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:23.137 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:23.137 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:23.137 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:23.137 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:23.137 10:53:58 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:23.394 [2024-07-12 10:53:58.398892] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:23.394 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:23.394 "name": "raid_bdev1", 00:27:23.394 "aliases": [ 00:27:23.394 "42b159df-00f6-4dee-becc-460a910605a2" 00:27:23.394 ], 00:27:23.394 "product_name": "Raid Volume", 00:27:23.394 "block_size": 4096, 00:27:23.394 "num_blocks": 7936, 00:27:23.395 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:23.395 "md_size": 32, 00:27:23.395 "md_interleave": false, 00:27:23.395 "dif_type": 0, 00:27:23.395 "assigned_rate_limits": { 00:27:23.395 "rw_ios_per_sec": 0, 00:27:23.395 "rw_mbytes_per_sec": 0, 00:27:23.395 "r_mbytes_per_sec": 0, 00:27:23.395 "w_mbytes_per_sec": 0 00:27:23.395 }, 00:27:23.395 "claimed": false, 00:27:23.395 "zoned": false, 00:27:23.395 "supported_io_types": { 00:27:23.395 "read": true, 00:27:23.395 "write": true, 00:27:23.395 "unmap": false, 00:27:23.395 "flush": false, 00:27:23.395 "reset": true, 00:27:23.395 "nvme_admin": false, 00:27:23.395 "nvme_io": false, 00:27:23.395 "nvme_io_md": false, 00:27:23.395 "write_zeroes": true, 00:27:23.395 "zcopy": false, 00:27:23.395 "get_zone_info": false, 00:27:23.395 "zone_management": false, 00:27:23.395 "zone_append": false, 00:27:23.395 "compare": false, 00:27:23.395 "compare_and_write": false, 00:27:23.395 "abort": false, 00:27:23.395 "seek_hole": false, 00:27:23.395 "seek_data": false, 00:27:23.395 "copy": false, 00:27:23.395 "nvme_iov_md": false 00:27:23.395 }, 00:27:23.395 "memory_domains": [ 00:27:23.395 { 00:27:23.395 "dma_device_id": "system", 00:27:23.395 "dma_device_type": 1 00:27:23.395 }, 00:27:23.395 { 00:27:23.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.395 "dma_device_type": 2 00:27:23.395 }, 00:27:23.395 { 00:27:23.395 "dma_device_id": "system", 00:27:23.395 "dma_device_type": 1 00:27:23.395 }, 00:27:23.395 { 00:27:23.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.395 "dma_device_type": 2 00:27:23.395 } 00:27:23.395 ], 00:27:23.395 "driver_specific": { 00:27:23.395 "raid": { 00:27:23.395 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:23.395 "strip_size_kb": 0, 00:27:23.395 "state": "online", 00:27:23.395 "raid_level": "raid1", 00:27:23.395 "superblock": true, 00:27:23.395 "num_base_bdevs": 2, 00:27:23.395 "num_base_bdevs_discovered": 2, 00:27:23.395 "num_base_bdevs_operational": 2, 00:27:23.395 "base_bdevs_list": [ 00:27:23.395 { 00:27:23.395 "name": "pt1", 00:27:23.395 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:23.395 "is_configured": true, 00:27:23.395 "data_offset": 256, 00:27:23.395 "data_size": 7936 00:27:23.395 }, 00:27:23.395 { 00:27:23.395 "name": "pt2", 00:27:23.395 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:23.395 "is_configured": true, 00:27:23.395 "data_offset": 256, 00:27:23.395 "data_size": 7936 00:27:23.395 } 00:27:23.395 ] 00:27:23.395 } 00:27:23.395 } 00:27:23.395 }' 00:27:23.395 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:23.395 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:23.395 pt2' 00:27:23.395 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:23.395 10:53:58 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:23.395 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:23.686 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:23.686 "name": "pt1", 00:27:23.686 "aliases": [ 00:27:23.686 "00000000-0000-0000-0000-000000000001" 00:27:23.686 ], 00:27:23.686 "product_name": "passthru", 00:27:23.686 "block_size": 4096, 00:27:23.686 "num_blocks": 8192, 00:27:23.686 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:23.686 "md_size": 32, 00:27:23.686 "md_interleave": false, 00:27:23.686 "dif_type": 0, 00:27:23.686 "assigned_rate_limits": { 00:27:23.686 "rw_ios_per_sec": 0, 00:27:23.686 "rw_mbytes_per_sec": 0, 00:27:23.686 "r_mbytes_per_sec": 0, 00:27:23.686 "w_mbytes_per_sec": 0 00:27:23.686 }, 00:27:23.686 "claimed": true, 00:27:23.686 "claim_type": "exclusive_write", 00:27:23.686 "zoned": false, 00:27:23.686 "supported_io_types": { 00:27:23.686 "read": true, 00:27:23.686 "write": true, 00:27:23.686 "unmap": true, 00:27:23.686 "flush": true, 00:27:23.686 "reset": true, 00:27:23.686 "nvme_admin": false, 00:27:23.686 "nvme_io": false, 00:27:23.686 "nvme_io_md": false, 00:27:23.686 "write_zeroes": true, 00:27:23.686 "zcopy": true, 00:27:23.686 "get_zone_info": false, 00:27:23.686 "zone_management": false, 00:27:23.686 "zone_append": false, 00:27:23.686 "compare": false, 00:27:23.686 "compare_and_write": false, 00:27:23.686 "abort": true, 00:27:23.686 "seek_hole": false, 00:27:23.686 "seek_data": false, 00:27:23.686 "copy": true, 00:27:23.686 "nvme_iov_md": false 00:27:23.686 }, 00:27:23.686 "memory_domains": [ 00:27:23.686 { 00:27:23.686 "dma_device_id": "system", 00:27:23.686 "dma_device_type": 1 00:27:23.686 }, 00:27:23.686 { 00:27:23.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.686 "dma_device_type": 2 00:27:23.686 } 00:27:23.686 ], 00:27:23.686 "driver_specific": { 00:27:23.686 "passthru": { 00:27:23.686 "name": "pt1", 00:27:23.686 "base_bdev_name": "malloc1" 00:27:23.686 } 00:27:23.686 } 00:27:23.686 }' 00:27:23.686 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.686 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.686 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:23.686 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.686 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.945 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:23.945 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.945 10:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.945 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:23.945 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.945 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.945 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 
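At this point the trace has built the array and is walking pt1's exported properties (block_size 4096, md_size 32, md_interleave false, dif_type 0); the same checks follow for pt2 below. For anyone reproducing the setup by hand against a running bdev_svc target, the RPC sequence exercised so far condenses to the sketch below. It is only a sketch: the socket path, sizes and names are copied from the trace, and rpc.py is assumed to be run from an SPDK checkout.

# Rebuild the test topology by hand: two 32 MiB malloc bdevs with 4096-byte
# blocks and 32 bytes of separate metadata, wrapped in passthru bdevs, then a
# RAID1 with an on-disk superblock (-s).
sock=/var/tmp/spdk-raid.sock
for i in 1 2; do
    ./scripts/rpc.py -s "$sock" bdev_malloc_create 32 4096 -m 32 -b malloc$i
    ./scripts/rpc.py -s "$sock" bdev_passthru_create -b malloc$i -p pt$i \
        -u 00000000-0000-0000-0000-00000000000$i
done
./scripts/rpc.py -s "$sock" bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
# The array should come up online with both base bdevs configured
./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all | \
    jq -r '.[] | select(.name == "raid_bdev1") | .state'

The -s flag persists a RAID superblock on the members, which is what the "File exists" rejection and the re-assembly from superblocks further down rely on; the separate-metadata aspect of the test comes from the malloc bdevs' -m 32 without interleaving (md_interleave stays false in the dumps).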
00:27:23.945 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:23.945 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:23.945 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:24.202 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:24.202 "name": "pt2", 00:27:24.202 "aliases": [ 00:27:24.202 "00000000-0000-0000-0000-000000000002" 00:27:24.202 ], 00:27:24.202 "product_name": "passthru", 00:27:24.202 "block_size": 4096, 00:27:24.202 "num_blocks": 8192, 00:27:24.202 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:24.202 "md_size": 32, 00:27:24.202 "md_interleave": false, 00:27:24.202 "dif_type": 0, 00:27:24.202 "assigned_rate_limits": { 00:27:24.202 "rw_ios_per_sec": 0, 00:27:24.202 "rw_mbytes_per_sec": 0, 00:27:24.202 "r_mbytes_per_sec": 0, 00:27:24.202 "w_mbytes_per_sec": 0 00:27:24.202 }, 00:27:24.202 "claimed": true, 00:27:24.202 "claim_type": "exclusive_write", 00:27:24.202 "zoned": false, 00:27:24.202 "supported_io_types": { 00:27:24.202 "read": true, 00:27:24.202 "write": true, 00:27:24.202 "unmap": true, 00:27:24.202 "flush": true, 00:27:24.202 "reset": true, 00:27:24.202 "nvme_admin": false, 00:27:24.202 "nvme_io": false, 00:27:24.202 "nvme_io_md": false, 00:27:24.202 "write_zeroes": true, 00:27:24.202 "zcopy": true, 00:27:24.202 "get_zone_info": false, 00:27:24.202 "zone_management": false, 00:27:24.202 "zone_append": false, 00:27:24.202 "compare": false, 00:27:24.202 "compare_and_write": false, 00:27:24.202 "abort": true, 00:27:24.202 "seek_hole": false, 00:27:24.202 "seek_data": false, 00:27:24.202 "copy": true, 00:27:24.202 "nvme_iov_md": false 00:27:24.202 }, 00:27:24.202 "memory_domains": [ 00:27:24.202 { 00:27:24.202 "dma_device_id": "system", 00:27:24.202 "dma_device_type": 1 00:27:24.202 }, 00:27:24.202 { 00:27:24.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:24.202 "dma_device_type": 2 00:27:24.202 } 00:27:24.202 ], 00:27:24.202 "driver_specific": { 00:27:24.202 "passthru": { 00:27:24.202 "name": "pt2", 00:27:24.202 "base_bdev_name": "malloc2" 00:27:24.202 } 00:27:24.202 } 00:27:24.202 }' 00:27:24.202 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:24.202 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:24.484 10:53:59 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:24.741 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:24.741 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:24.741 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:24.741 [2024-07-12 10:53:59.918906] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:24.999 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=42b159df-00f6-4dee-becc-460a910605a2 00:27:24.999 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 42b159df-00f6-4dee-becc-460a910605a2 ']' 00:27:24.999 10:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:24.999 [2024-07-12 10:54:00.167303] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:24.999 [2024-07-12 10:54:00.167332] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:24.999 [2024-07-12 10:54:00.167396] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:24.999 [2024-07-12 10:54:00.167449] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:24.999 [2024-07-12 10:54:00.167461] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22dad20 name raid_bdev1, state offline 00:27:24.999 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.999 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:25.257 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:25.257 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:25.257 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:25.257 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:25.515 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:25.515 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:25.772 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:25.772 10:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:26.031 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:26.290 [2024-07-12 10:54:01.406544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:26.290 [2024-07-12 10:54:01.407883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:26.290 [2024-07-12 10:54:01.407942] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:26.290 [2024-07-12 10:54:01.407983] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:26.290 [2024-07-12 10:54:01.408002] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:26.290 [2024-07-12 10:54:01.408011] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x214aed0 name raid_bdev1, state configuring 00:27:26.290 request: 00:27:26.290 { 00:27:26.290 "name": "raid_bdev1", 00:27:26.290 "raid_level": "raid1", 00:27:26.290 "base_bdevs": [ 00:27:26.290 "malloc1", 00:27:26.290 "malloc2" 00:27:26.290 ], 00:27:26.290 "superblock": false, 00:27:26.290 "method": "bdev_raid_create", 00:27:26.290 "req_id": 1 00:27:26.290 } 00:27:26.290 Got JSON-RPC error response 00:27:26.290 response: 00:27:26.290 { 00:27:26.290 "code": -17, 00:27:26.290 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:26.290 } 00:27:26.290 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:27:26.290 10:54:01 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:26.290 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:26.290 10:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:26.290 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.290 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:26.549 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:26.549 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:26.549 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:26.807 [2024-07-12 10:54:01.843628] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:26.807 [2024-07-12 10:54:01.843672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:26.807 [2024-07-12 10:54:01.843690] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22e8ee0 00:27:26.807 [2024-07-12 10:54:01.843704] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:26.807 [2024-07-12 10:54:01.845066] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:26.807 [2024-07-12 10:54:01.845092] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:26.807 [2024-07-12 10:54:01.845135] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:26.807 [2024-07-12 10:54:01.845157] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:26.807 pt1 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
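The rejected bdev_raid_create above is the negative case this test adds on top of the plain superblock test: the earlier array was created with -s, so malloc1 and malloc2 still carry its superblock, and building a fresh array directly on them fails with JSON-RPC error -17 ("File exists"); the trace that follows double-checks that the failed call left no raid bdev behind. Outside the harness's NOT helper, the same expectation can be asserted with an ordinary shell check, sketched below with the socket and names taken from the trace.

# Expect the create to be rejected while stale superblocks sit on the base bdevs.
sock=/var/tmp/spdk-raid.sock
if ./scripts/rpc.py -s "$sock" bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1; then
    echo "unexpected success: stale superblocks were not detected" >&2
    exit 1
fi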
00:27:26.807 10:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.066 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:27.066 "name": "raid_bdev1", 00:27:27.066 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:27.066 "strip_size_kb": 0, 00:27:27.066 "state": "configuring", 00:27:27.066 "raid_level": "raid1", 00:27:27.066 "superblock": true, 00:27:27.066 "num_base_bdevs": 2, 00:27:27.066 "num_base_bdevs_discovered": 1, 00:27:27.066 "num_base_bdevs_operational": 2, 00:27:27.067 "base_bdevs_list": [ 00:27:27.067 { 00:27:27.067 "name": "pt1", 00:27:27.067 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:27.067 "is_configured": true, 00:27:27.067 "data_offset": 256, 00:27:27.067 "data_size": 7936 00:27:27.067 }, 00:27:27.067 { 00:27:27.067 "name": null, 00:27:27.067 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:27.067 "is_configured": false, 00:27:27.067 "data_offset": 256, 00:27:27.067 "data_size": 7936 00:27:27.067 } 00:27:27.067 ] 00:27:27.067 }' 00:27:27.067 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:27.067 10:54:02 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:27.632 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:27.632 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:27.632 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:27.632 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:27.891 [2024-07-12 10:54:02.918564] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:27.891 [2024-07-12 10:54:02.918617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.891 [2024-07-12 10:54:02.918636] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x214b490 00:27:27.891 [2024-07-12 10:54:02.918649] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.891 [2024-07-12 10:54:02.918840] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.891 [2024-07-12 10:54:02.918857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:27.891 [2024-07-12 10:54:02.918901] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:27.891 [2024-07-12 10:54:02.918918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:27.891 [2024-07-12 10:54:02.919006] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22cf5d0 00:27:27.891 [2024-07-12 10:54:02.919017] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:27.891 [2024-07-12 10:54:02.919071] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d0800 00:27:27.891 [2024-07-12 10:54:02.919171] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22cf5d0 00:27:27.891 [2024-07-12 10:54:02.919181] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22cf5d0 
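Re-creating pt1 is enough for examine to find the superblock written earlier and pull raid_bdev1 back into a "configuring" state with one of two base bdevs discovered; once pt2 reappears the array goes online again, as the verification that follows confirms. The state polling the harness does with verify_raid_bdev_state reduces to a bdev_raid_get_bdevs call filtered with jq; a minimal sketch, assuming the same socket and an SPDK checkout:

# Report the array state the way verify_raid_bdev_state does, by filtering
# bdev_raid_get_bdevs output with jq (field names as in the dumps above).
sock=/var/tmp/spdk-raid.sock
info=$(./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all | \
    jq -r '.[] | select(.name == "raid_bdev1")')
echo "state:      $(echo "$info" | jq -r .state)"
echo "discovered: $(echo "$info" | jq -r .num_base_bdevs_discovered) of $(echo "$info" | jq -r .num_base_bdevs_operational)"
# Expected: "configuring", 1 of 2 while only pt1 is back; "online", 2 of 2 once pt2 returns.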
00:27:27.891 [2024-07-12 10:54:02.919249] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:27.891 pt2 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.891 10:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:28.149 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:28.149 "name": "raid_bdev1", 00:27:28.149 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:28.149 "strip_size_kb": 0, 00:27:28.149 "state": "online", 00:27:28.149 "raid_level": "raid1", 00:27:28.149 "superblock": true, 00:27:28.149 "num_base_bdevs": 2, 00:27:28.149 "num_base_bdevs_discovered": 2, 00:27:28.149 "num_base_bdevs_operational": 2, 00:27:28.149 "base_bdevs_list": [ 00:27:28.149 { 00:27:28.149 "name": "pt1", 00:27:28.150 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:28.150 "is_configured": true, 00:27:28.150 "data_offset": 256, 00:27:28.150 "data_size": 7936 00:27:28.150 }, 00:27:28.150 { 00:27:28.150 "name": "pt2", 00:27:28.150 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:28.150 "is_configured": true, 00:27:28.150 "data_offset": 256, 00:27:28.150 "data_size": 7936 00:27:28.150 } 00:27:28.150 ] 00:27:28.150 }' 00:27:28.150 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:28.150 10:54:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:28.718 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:28.718 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:28.718 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:28.718 10:54:03 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:28.718 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:28.718 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:28.718 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:28.718 10:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:28.978 [2024-07-12 10:54:04.013724] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:28.978 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:28.978 "name": "raid_bdev1", 00:27:28.978 "aliases": [ 00:27:28.978 "42b159df-00f6-4dee-becc-460a910605a2" 00:27:28.978 ], 00:27:28.978 "product_name": "Raid Volume", 00:27:28.978 "block_size": 4096, 00:27:28.978 "num_blocks": 7936, 00:27:28.978 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:28.978 "md_size": 32, 00:27:28.978 "md_interleave": false, 00:27:28.978 "dif_type": 0, 00:27:28.978 "assigned_rate_limits": { 00:27:28.978 "rw_ios_per_sec": 0, 00:27:28.978 "rw_mbytes_per_sec": 0, 00:27:28.978 "r_mbytes_per_sec": 0, 00:27:28.978 "w_mbytes_per_sec": 0 00:27:28.978 }, 00:27:28.978 "claimed": false, 00:27:28.978 "zoned": false, 00:27:28.978 "supported_io_types": { 00:27:28.978 "read": true, 00:27:28.978 "write": true, 00:27:28.978 "unmap": false, 00:27:28.978 "flush": false, 00:27:28.978 "reset": true, 00:27:28.978 "nvme_admin": false, 00:27:28.978 "nvme_io": false, 00:27:28.978 "nvme_io_md": false, 00:27:28.978 "write_zeroes": true, 00:27:28.978 "zcopy": false, 00:27:28.978 "get_zone_info": false, 00:27:28.978 "zone_management": false, 00:27:28.978 "zone_append": false, 00:27:28.978 "compare": false, 00:27:28.978 "compare_and_write": false, 00:27:28.978 "abort": false, 00:27:28.978 "seek_hole": false, 00:27:28.978 "seek_data": false, 00:27:28.978 "copy": false, 00:27:28.978 "nvme_iov_md": false 00:27:28.978 }, 00:27:28.978 "memory_domains": [ 00:27:28.978 { 00:27:28.978 "dma_device_id": "system", 00:27:28.978 "dma_device_type": 1 00:27:28.978 }, 00:27:28.978 { 00:27:28.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.978 "dma_device_type": 2 00:27:28.978 }, 00:27:28.978 { 00:27:28.978 "dma_device_id": "system", 00:27:28.978 "dma_device_type": 1 00:27:28.978 }, 00:27:28.978 { 00:27:28.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.978 "dma_device_type": 2 00:27:28.978 } 00:27:28.978 ], 00:27:28.978 "driver_specific": { 00:27:28.978 "raid": { 00:27:28.978 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:28.978 "strip_size_kb": 0, 00:27:28.978 "state": "online", 00:27:28.978 "raid_level": "raid1", 00:27:28.978 "superblock": true, 00:27:28.978 "num_base_bdevs": 2, 00:27:28.978 "num_base_bdevs_discovered": 2, 00:27:28.978 "num_base_bdevs_operational": 2, 00:27:28.978 "base_bdevs_list": [ 00:27:28.978 { 00:27:28.978 "name": "pt1", 00:27:28.978 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:28.978 "is_configured": true, 00:27:28.978 "data_offset": 256, 00:27:28.978 "data_size": 7936 00:27:28.978 }, 00:27:28.978 { 00:27:28.978 "name": "pt2", 00:27:28.978 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:28.978 "is_configured": true, 00:27:28.978 "data_offset": 256, 00:27:28.978 "data_size": 7936 
00:27:28.978 } 00:27:28.978 ] 00:27:28.978 } 00:27:28.978 } 00:27:28.978 }' 00:27:28.978 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:28.978 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:28.978 pt2' 00:27:28.978 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:28.978 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:28.978 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:29.238 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:29.238 "name": "pt1", 00:27:29.238 "aliases": [ 00:27:29.238 "00000000-0000-0000-0000-000000000001" 00:27:29.238 ], 00:27:29.238 "product_name": "passthru", 00:27:29.238 "block_size": 4096, 00:27:29.238 "num_blocks": 8192, 00:27:29.238 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:29.238 "md_size": 32, 00:27:29.238 "md_interleave": false, 00:27:29.238 "dif_type": 0, 00:27:29.238 "assigned_rate_limits": { 00:27:29.238 "rw_ios_per_sec": 0, 00:27:29.238 "rw_mbytes_per_sec": 0, 00:27:29.238 "r_mbytes_per_sec": 0, 00:27:29.238 "w_mbytes_per_sec": 0 00:27:29.238 }, 00:27:29.238 "claimed": true, 00:27:29.238 "claim_type": "exclusive_write", 00:27:29.238 "zoned": false, 00:27:29.238 "supported_io_types": { 00:27:29.238 "read": true, 00:27:29.238 "write": true, 00:27:29.238 "unmap": true, 00:27:29.238 "flush": true, 00:27:29.238 "reset": true, 00:27:29.238 "nvme_admin": false, 00:27:29.238 "nvme_io": false, 00:27:29.238 "nvme_io_md": false, 00:27:29.238 "write_zeroes": true, 00:27:29.238 "zcopy": true, 00:27:29.238 "get_zone_info": false, 00:27:29.238 "zone_management": false, 00:27:29.238 "zone_append": false, 00:27:29.238 "compare": false, 00:27:29.238 "compare_and_write": false, 00:27:29.238 "abort": true, 00:27:29.238 "seek_hole": false, 00:27:29.238 "seek_data": false, 00:27:29.238 "copy": true, 00:27:29.238 "nvme_iov_md": false 00:27:29.238 }, 00:27:29.238 "memory_domains": [ 00:27:29.238 { 00:27:29.238 "dma_device_id": "system", 00:27:29.238 "dma_device_type": 1 00:27:29.238 }, 00:27:29.238 { 00:27:29.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:29.238 "dma_device_type": 2 00:27:29.238 } 00:27:29.238 ], 00:27:29.238 "driver_specific": { 00:27:29.238 "passthru": { 00:27:29.238 "name": "pt1", 00:27:29.238 "base_bdev_name": "malloc1" 00:27:29.238 } 00:27:29.238 } 00:27:29.238 }' 00:27:29.238 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:29.238 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:29.238 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:29.238 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:29.497 
10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:29.497 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:29.755 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:29.755 "name": "pt2", 00:27:29.755 "aliases": [ 00:27:29.755 "00000000-0000-0000-0000-000000000002" 00:27:29.755 ], 00:27:29.755 "product_name": "passthru", 00:27:29.755 "block_size": 4096, 00:27:29.755 "num_blocks": 8192, 00:27:29.755 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:29.755 "md_size": 32, 00:27:29.755 "md_interleave": false, 00:27:29.755 "dif_type": 0, 00:27:29.755 "assigned_rate_limits": { 00:27:29.755 "rw_ios_per_sec": 0, 00:27:29.755 "rw_mbytes_per_sec": 0, 00:27:29.755 "r_mbytes_per_sec": 0, 00:27:29.755 "w_mbytes_per_sec": 0 00:27:29.755 }, 00:27:29.755 "claimed": true, 00:27:29.755 "claim_type": "exclusive_write", 00:27:29.755 "zoned": false, 00:27:29.755 "supported_io_types": { 00:27:29.756 "read": true, 00:27:29.756 "write": true, 00:27:29.756 "unmap": true, 00:27:29.756 "flush": true, 00:27:29.756 "reset": true, 00:27:29.756 "nvme_admin": false, 00:27:29.756 "nvme_io": false, 00:27:29.756 "nvme_io_md": false, 00:27:29.756 "write_zeroes": true, 00:27:29.756 "zcopy": true, 00:27:29.756 "get_zone_info": false, 00:27:29.756 "zone_management": false, 00:27:29.756 "zone_append": false, 00:27:29.756 "compare": false, 00:27:29.756 "compare_and_write": false, 00:27:29.756 "abort": true, 00:27:29.756 "seek_hole": false, 00:27:29.756 "seek_data": false, 00:27:29.756 "copy": true, 00:27:29.756 "nvme_iov_md": false 00:27:29.756 }, 00:27:29.756 "memory_domains": [ 00:27:29.756 { 00:27:29.756 "dma_device_id": "system", 00:27:29.756 "dma_device_type": 1 00:27:29.756 }, 00:27:29.756 { 00:27:29.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:29.756 "dma_device_type": 2 00:27:29.756 } 00:27:29.756 ], 00:27:29.756 "driver_specific": { 00:27:29.756 "passthru": { 00:27:29.756 "name": "pt2", 00:27:29.756 "base_bdev_name": "malloc2" 00:27:29.756 } 00:27:29.756 } 00:27:29.756 }' 00:27:29.756 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:30.014 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:30.014 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:30.014 10:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:30.014 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:30.014 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- 
# [[ 32 == 32 ]] 00:27:30.014 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:30.014 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:30.014 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:30.014 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:30.273 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:30.273 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:30.273 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:30.273 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:30.531 [2024-07-12 10:54:05.493664] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:30.531 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 42b159df-00f6-4dee-becc-460a910605a2 '!=' 42b159df-00f6-4dee-becc-460a910605a2 ']' 00:27:30.531 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:30.531 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:30.531 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:30.531 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:30.789 [2024-07-12 10:54:05.742063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:30.789 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:30.790 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:30.790 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:30.790 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.790 10:54:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.048 10:54:06 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:31.048 "name": "raid_bdev1", 00:27:31.048 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:31.048 "strip_size_kb": 0, 00:27:31.048 "state": "online", 00:27:31.048 "raid_level": "raid1", 00:27:31.048 "superblock": true, 00:27:31.048 "num_base_bdevs": 2, 00:27:31.048 "num_base_bdevs_discovered": 1, 00:27:31.048 "num_base_bdevs_operational": 1, 00:27:31.048 "base_bdevs_list": [ 00:27:31.048 { 00:27:31.048 "name": null, 00:27:31.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.048 "is_configured": false, 00:27:31.048 "data_offset": 256, 00:27:31.048 "data_size": 7936 00:27:31.048 }, 00:27:31.048 { 00:27:31.048 "name": "pt2", 00:27:31.048 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:31.048 "is_configured": true, 00:27:31.048 "data_offset": 256, 00:27:31.048 "data_size": 7936 00:27:31.048 } 00:27:31.048 ] 00:27:31.048 }' 00:27:31.048 10:54:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:31.048 10:54:06 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:31.615 10:54:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:31.615 [2024-07-12 10:54:06.800843] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:31.615 [2024-07-12 10:54:06.800869] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:31.615 [2024-07-12 10:54:06.800923] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:31.615 [2024-07-12 10:54:06.800969] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:31.615 [2024-07-12 10:54:06.800981] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22cf5d0 name raid_bdev1, state offline 00:27:31.872 10:54:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.872 10:54:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:31.872 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:31.873 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:31.873 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:31.873 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:32.130 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:32.131 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:32.131 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:32.131 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:32.131 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:32.131 10:54:07 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@518 -- # i=1 00:27:32.131 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:32.389 [2024-07-12 10:54:07.534741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:32.389 [2024-07-12 10:54:07.534793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.389 [2024-07-12 10:54:07.534814] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22cd660 00:27:32.389 [2024-07-12 10:54:07.534826] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.389 [2024-07-12 10:54:07.536311] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.389 [2024-07-12 10:54:07.536337] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:32.389 [2024-07-12 10:54:07.536387] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:32.389 [2024-07-12 10:54:07.536411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:32.389 [2024-07-12 10:54:07.536498] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22cfd10 00:27:32.389 [2024-07-12 10:54:07.536508] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:32.389 [2024-07-12 10:54:07.536564] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d0560 00:27:32.389 [2024-07-12 10:54:07.536659] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22cfd10 00:27:32.389 [2024-07-12 10:54:07.536669] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22cfd10 00:27:32.389 [2024-07-12 10:54:07.536736] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:32.389 pt2 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.389 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.648 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.648 "name": "raid_bdev1", 00:27:32.648 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:32.648 "strip_size_kb": 0, 00:27:32.648 "state": "online", 00:27:32.648 "raid_level": "raid1", 00:27:32.648 "superblock": true, 00:27:32.648 "num_base_bdevs": 2, 00:27:32.648 "num_base_bdevs_discovered": 1, 00:27:32.648 "num_base_bdevs_operational": 1, 00:27:32.648 "base_bdevs_list": [ 00:27:32.648 { 00:27:32.648 "name": null, 00:27:32.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.648 "is_configured": false, 00:27:32.648 "data_offset": 256, 00:27:32.648 "data_size": 7936 00:27:32.648 }, 00:27:32.648 { 00:27:32.648 "name": "pt2", 00:27:32.648 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:32.648 "is_configured": true, 00:27:32.648 "data_offset": 256, 00:27:32.648 "data_size": 7936 00:27:32.648 } 00:27:32.648 ] 00:27:32.648 }' 00:27:32.648 10:54:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.648 10:54:07 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:33.216 10:54:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:33.475 [2024-07-12 10:54:08.605675] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:33.475 [2024-07-12 10:54:08.605703] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:33.475 [2024-07-12 10:54:08.605758] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:33.475 [2024-07-12 10:54:08.605801] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:33.475 [2024-07-12 10:54:08.605812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22cfd10 name raid_bdev1, state offline 00:27:33.475 10:54:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.475 10:54:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:33.734 10:54:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:33.734 10:54:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:33.734 10:54:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:33.734 10:54:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:33.993 [2024-07-12 10:54:09.094945] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:33.993 [2024-07-12 10:54:09.094994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:33.993 [2024-07-12 10:54:09.095013] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22ce760 00:27:33.993 [2024-07-12 10:54:09.095025] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:33.993 [2024-07-12 10:54:09.096450] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:33.993 [2024-07-12 10:54:09.096477] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:33.993 [2024-07-12 10:54:09.096531] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:33.993 [2024-07-12 10:54:09.096554] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:33.993 [2024-07-12 10:54:09.096645] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:33.993 [2024-07-12 10:54:09.096658] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:33.993 [2024-07-12 10:54:09.096674] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d0850 name raid_bdev1, state configuring 00:27:33.994 [2024-07-12 10:54:09.096696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:33.994 [2024-07-12 10:54:09.096749] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22cf850 00:27:33.994 [2024-07-12 10:54:09.096759] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:33.994 [2024-07-12 10:54:09.096817] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d03b0 00:27:33.994 [2024-07-12 10:54:09.096913] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22cf850 00:27:33.994 [2024-07-12 10:54:09.096922] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22cf850 00:27:33.994 [2024-07-12 10:54:09.096996] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:33.994 pt1 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.994 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.253 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:27:34.253 "name": "raid_bdev1", 00:27:34.253 "uuid": "42b159df-00f6-4dee-becc-460a910605a2", 00:27:34.253 "strip_size_kb": 0, 00:27:34.253 "state": "online", 00:27:34.253 "raid_level": "raid1", 00:27:34.253 "superblock": true, 00:27:34.253 "num_base_bdevs": 2, 00:27:34.253 "num_base_bdevs_discovered": 1, 00:27:34.253 "num_base_bdevs_operational": 1, 00:27:34.253 "base_bdevs_list": [ 00:27:34.253 { 00:27:34.253 "name": null, 00:27:34.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.253 "is_configured": false, 00:27:34.253 "data_offset": 256, 00:27:34.253 "data_size": 7936 00:27:34.253 }, 00:27:34.253 { 00:27:34.253 "name": "pt2", 00:27:34.253 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:34.253 "is_configured": true, 00:27:34.253 "data_offset": 256, 00:27:34.253 "data_size": 7936 00:27:34.253 } 00:27:34.253 ] 00:27:34.253 }' 00:27:34.253 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.253 10:54:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:34.820 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:34.820 10:54:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:35.078 10:54:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:35.078 10:54:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:35.078 10:54:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:35.336 [2024-07-12 10:54:10.346514] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 42b159df-00f6-4dee-becc-460a910605a2 '!=' 42b159df-00f6-4dee-becc-460a910605a2 ']' 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2162988 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2162988 ']' 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 2162988 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2162988 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2162988' 00:27:35.336 killing process with pid 2162988 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 2162988 00:27:35.336 [2024-07-12 10:54:10.432554] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:35.336 [2024-07-12 10:54:10.432609] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:35.336 [2024-07-12 10:54:10.432652] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:35.336 [2024-07-12 10:54:10.432664] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22cf850 name raid_bdev1, state offline 00:27:35.336 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 2162988 00:27:35.336 [2024-07-12 10:54:10.455561] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:35.595 10:54:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:27:35.595 00:27:35.595 real 0m15.630s 00:27:35.595 user 0m28.301s 00:27:35.595 sys 0m2.901s 00:27:35.595 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:35.595 10:54:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:35.595 ************************************ 00:27:35.595 END TEST raid_superblock_test_md_separate 00:27:35.595 ************************************ 00:27:35.595 10:54:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:35.595 10:54:10 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:27:35.595 10:54:10 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:27:35.595 10:54:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:35.596 10:54:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:35.596 10:54:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:35.596 ************************************ 00:27:35.596 START TEST raid_rebuild_test_sb_md_separate 00:27:35.596 ************************************ 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:35.596 10:54:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2165755 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2165755 /var/tmp/spdk-raid.sock 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2165755 ']' 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:35.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:35.596 10:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:35.855 [2024-07-12 10:54:10.806312] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:27:35.855 [2024-07-12 10:54:10.806376] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2165755 ] 00:27:35.855 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:35.855 Zero copy mechanism will not be used. 
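For reference, a condensed sketch of the RPC sequence this rebuild test drives over /var/tmp/spdk-raid.sock once bdevperf is listening, reconstructed from the xtrace that follows; the bdev names, sizes and socket path are copied from the trace, while the loop and the final jq read-back are only an illustrative condensation, not bdev_raid.sh itself.

# Sketch reconstructed from the trace below -- not the test script.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Two md-enabled malloc bdevs, each wrapped in a passthru, become the RAID-1 members.
for i in 1 2; do
  $rpc -s $sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev${i}_malloc
  $rpc -s $sock bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
done

# The spare used later for the rebuild sits behind a delay bdev.
$rpc -s $sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc
$rpc -s $sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$rpc -s $sock bdev_passthru_create -b spare_delay -p spare

# Assemble the array with a superblock (-s) and read back its state.
$rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
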
00:27:35.855 [2024-07-12 10:54:10.932897] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:35.855 [2024-07-12 10:54:11.034940] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:36.114 [2024-07-12 10:54:11.096655] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:36.114 [2024-07-12 10:54:11.096692] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:36.681 10:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:36.681 10:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:36.681 10:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:36.681 10:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:27:36.940 BaseBdev1_malloc 00:27:36.940 10:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:37.199 [2024-07-12 10:54:12.201951] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:37.199 [2024-07-12 10:54:12.202000] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:37.199 [2024-07-12 10:54:12.202025] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23226d0 00:27:37.199 [2024-07-12 10:54:12.202044] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:37.199 [2024-07-12 10:54:12.203559] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:37.199 [2024-07-12 10:54:12.203587] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:37.199 BaseBdev1 00:27:37.199 10:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:37.199 10:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:27:37.458 BaseBdev2_malloc 00:27:37.458 10:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:37.716 [2024-07-12 10:54:12.690018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:37.716 [2024-07-12 10:54:12.690065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:37.716 [2024-07-12 10:54:12.690086] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x247a1f0 00:27:37.716 [2024-07-12 10:54:12.690099] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:37.716 [2024-07-12 10:54:12.691495] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:37.716 [2024-07-12 10:54:12.691523] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:37.716 BaseBdev2 00:27:37.716 10:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:27:37.975 spare_malloc 00:27:37.975 10:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:38.234 spare_delay 00:27:38.234 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:38.234 [2024-07-12 10:54:13.421228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:38.234 [2024-07-12 10:54:13.421275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:38.234 [2024-07-12 10:54:13.421297] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24767a0 00:27:38.234 [2024-07-12 10:54:13.421310] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:38.234 [2024-07-12 10:54:13.422725] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:38.234 [2024-07-12 10:54:13.422753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:38.234 spare 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:38.493 [2024-07-12 10:54:13.653868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:38.493 [2024-07-12 10:54:13.655203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:38.493 [2024-07-12 10:54:13.655373] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24771c0 00:27:38.493 [2024-07-12 10:54:13.655386] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:38.493 [2024-07-12 10:54:13.655464] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2388360 00:27:38.493 [2024-07-12 10:54:13.655589] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24771c0 00:27:38.493 [2024-07-12 10:54:13.655599] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24771c0 00:27:38.493 [2024-07-12 10:54:13.655674] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.493 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.752 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:38.752 "name": "raid_bdev1", 00:27:38.752 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:38.752 "strip_size_kb": 0, 00:27:38.752 "state": "online", 00:27:38.752 "raid_level": "raid1", 00:27:38.752 "superblock": true, 00:27:38.752 "num_base_bdevs": 2, 00:27:38.752 "num_base_bdevs_discovered": 2, 00:27:38.752 "num_base_bdevs_operational": 2, 00:27:38.752 "base_bdevs_list": [ 00:27:38.752 { 00:27:38.752 "name": "BaseBdev1", 00:27:38.752 "uuid": "32b99183-1d8f-58c7-8da3-fb5d530ac485", 00:27:38.752 "is_configured": true, 00:27:38.752 "data_offset": 256, 00:27:38.752 "data_size": 7936 00:27:38.752 }, 00:27:38.752 { 00:27:38.752 "name": "BaseBdev2", 00:27:38.752 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:38.752 "is_configured": true, 00:27:38.752 "data_offset": 256, 00:27:38.752 "data_size": 7936 00:27:38.752 } 00:27:38.752 ] 00:27:38.752 }' 00:27:38.752 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:38.752 10:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:39.686 10:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:39.686 10:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:39.686 [2024-07-12 10:54:14.757027] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:39.686 10:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:39.686 10:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.686 10:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:39.945 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:40.204 [2024-07-12 10:54:15.258139] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2388360 00:27:40.204 /dev/nbd0 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:40.204 1+0 records in 00:27:40.204 1+0 records out 00:27:40.204 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253859 s, 16.1 MB/s 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:40.204 10:54:15 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:40.204 10:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:41.142 7936+0 records in 00:27:41.142 7936+0 records out 00:27:41.142 32505856 bytes (33 MB, 31 MiB) copied, 0.755997 s, 43.0 MB/s 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:41.142 [2024-07-12 10:54:16.272477] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:41.142 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:41.460 [2024-07-12 10:54:16.504831] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:41.460 10:54:16 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.460 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.719 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:41.719 "name": "raid_bdev1", 00:27:41.719 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:41.719 "strip_size_kb": 0, 00:27:41.719 "state": "online", 00:27:41.719 "raid_level": "raid1", 00:27:41.719 "superblock": true, 00:27:41.719 "num_base_bdevs": 2, 00:27:41.719 "num_base_bdevs_discovered": 1, 00:27:41.719 "num_base_bdevs_operational": 1, 00:27:41.719 "base_bdevs_list": [ 00:27:41.719 { 00:27:41.719 "name": null, 00:27:41.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:41.719 "is_configured": false, 00:27:41.719 "data_offset": 256, 00:27:41.719 "data_size": 7936 00:27:41.719 }, 00:27:41.719 { 00:27:41.719 "name": "BaseBdev2", 00:27:41.719 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:41.719 "is_configured": true, 00:27:41.719 "data_offset": 256, 00:27:41.719 "data_size": 7936 00:27:41.719 } 00:27:41.719 ] 00:27:41.719 }' 00:27:41.719 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:41.719 10:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:42.286 10:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:42.545 [2024-07-12 10:54:17.543667] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:42.545 [2024-07-12 10:54:17.545950] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2321350 00:27:42.545 [2024-07-12 10:54:17.548237] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:42.545 10:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:43.477 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:43.477 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:43.477 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:43.477 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:43.477 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:43.477 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.477 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.734 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:43.734 "name": "raid_bdev1", 00:27:43.734 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:43.734 "strip_size_kb": 0, 00:27:43.734 "state": "online", 00:27:43.734 "raid_level": "raid1", 00:27:43.734 "superblock": true, 00:27:43.734 "num_base_bdevs": 2, 00:27:43.734 "num_base_bdevs_discovered": 2, 00:27:43.734 "num_base_bdevs_operational": 2, 00:27:43.735 "process": { 00:27:43.735 "type": "rebuild", 00:27:43.735 "target": "spare", 00:27:43.735 "progress": { 00:27:43.735 "blocks": 2816, 00:27:43.735 "percent": 35 00:27:43.735 } 00:27:43.735 }, 00:27:43.735 "base_bdevs_list": [ 00:27:43.735 { 00:27:43.735 "name": "spare", 00:27:43.735 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:43.735 "is_configured": true, 00:27:43.735 "data_offset": 256, 00:27:43.735 "data_size": 7936 00:27:43.735 }, 00:27:43.735 { 00:27:43.735 "name": "BaseBdev2", 00:27:43.735 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:43.735 "is_configured": true, 00:27:43.735 "data_offset": 256, 00:27:43.735 "data_size": 7936 00:27:43.735 } 00:27:43.735 ] 00:27:43.735 }' 00:27:43.735 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:43.735 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:43.735 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:43.735 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:43.735 10:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:43.992 [2024-07-12 10:54:19.057143] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.992 [2024-07-12 10:54:19.060058] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:43.992 [2024-07-12 10:54:19.060104] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:43.992 [2024-07-12 10:54:19.060119] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.992 [2024-07-12 10:54:19.060127] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.992 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.251 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:44.251 "name": "raid_bdev1", 00:27:44.251 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:44.251 "strip_size_kb": 0, 00:27:44.251 "state": "online", 00:27:44.251 "raid_level": "raid1", 00:27:44.251 "superblock": true, 00:27:44.251 "num_base_bdevs": 2, 00:27:44.251 "num_base_bdevs_discovered": 1, 00:27:44.251 "num_base_bdevs_operational": 1, 00:27:44.251 "base_bdevs_list": [ 00:27:44.251 { 00:27:44.251 "name": null, 00:27:44.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:44.251 "is_configured": false, 00:27:44.251 "data_offset": 256, 00:27:44.251 "data_size": 7936 00:27:44.251 }, 00:27:44.251 { 00:27:44.251 "name": "BaseBdev2", 00:27:44.251 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:44.251 "is_configured": true, 00:27:44.251 "data_offset": 256, 00:27:44.251 "data_size": 7936 00:27:44.251 } 00:27:44.251 ] 00:27:44.251 }' 00:27:44.251 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:44.251 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:44.817 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:44.817 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:44.817 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:44.817 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:44.817 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:44.817 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.817 10:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.074 10:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:45.074 "name": "raid_bdev1", 00:27:45.074 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:45.074 "strip_size_kb": 0, 00:27:45.074 "state": "online", 00:27:45.074 "raid_level": "raid1", 00:27:45.074 "superblock": true, 00:27:45.074 "num_base_bdevs": 2, 00:27:45.074 "num_base_bdevs_discovered": 1, 00:27:45.074 "num_base_bdevs_operational": 1, 00:27:45.074 "base_bdevs_list": [ 00:27:45.074 { 00:27:45.074 "name": null, 00:27:45.074 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:45.074 "is_configured": false, 00:27:45.074 "data_offset": 256, 00:27:45.074 "data_size": 7936 00:27:45.074 }, 00:27:45.074 { 00:27:45.074 "name": "BaseBdev2", 00:27:45.074 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:45.074 "is_configured": true, 00:27:45.074 "data_offset": 256, 00:27:45.074 "data_size": 7936 00:27:45.074 } 00:27:45.074 ] 00:27:45.074 }' 00:27:45.074 10:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:45.074 10:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:45.074 10:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:45.074 10:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:45.074 10:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:45.330 [2024-07-12 10:54:20.475527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:45.330 [2024-07-12 10:54:20.478009] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2322280 00:27:45.330 [2024-07-12 10:54:20.479588] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:45.330 10:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.700 "name": "raid_bdev1", 00:27:46.700 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:46.700 "strip_size_kb": 0, 00:27:46.700 "state": "online", 00:27:46.700 "raid_level": "raid1", 00:27:46.700 "superblock": true, 00:27:46.700 "num_base_bdevs": 2, 00:27:46.700 "num_base_bdevs_discovered": 2, 00:27:46.700 "num_base_bdevs_operational": 2, 00:27:46.700 "process": { 00:27:46.700 "type": "rebuild", 00:27:46.700 "target": "spare", 00:27:46.700 "progress": { 00:27:46.700 "blocks": 2816, 00:27:46.700 "percent": 35 00:27:46.700 } 00:27:46.700 }, 00:27:46.700 "base_bdevs_list": [ 00:27:46.700 { 00:27:46.700 "name": "spare", 00:27:46.700 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:46.700 "is_configured": true, 00:27:46.700 "data_offset": 256, 00:27:46.700 "data_size": 7936 00:27:46.700 }, 00:27:46.700 { 00:27:46.700 "name": 
"BaseBdev2", 00:27:46.700 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:46.700 "is_configured": true, 00:27:46.700 "data_offset": 256, 00:27:46.700 "data_size": 7936 00:27:46.700 } 00:27:46.700 ] 00:27:46.700 }' 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:46.700 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1057 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.700 10:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.265 10:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:47.265 "name": "raid_bdev1", 00:27:47.265 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:47.265 "strip_size_kb": 0, 00:27:47.265 "state": "online", 00:27:47.265 "raid_level": "raid1", 00:27:47.265 "superblock": true, 00:27:47.265 "num_base_bdevs": 2, 00:27:47.265 "num_base_bdevs_discovered": 2, 00:27:47.265 "num_base_bdevs_operational": 2, 00:27:47.265 "process": { 00:27:47.265 "type": "rebuild", 00:27:47.265 "target": "spare", 00:27:47.265 "progress": { 00:27:47.265 "blocks": 4352, 00:27:47.265 "percent": 54 00:27:47.265 } 00:27:47.265 }, 00:27:47.265 "base_bdevs_list": [ 00:27:47.265 { 00:27:47.265 "name": "spare", 00:27:47.265 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:47.265 "is_configured": true, 00:27:47.265 "data_offset": 256, 00:27:47.265 "data_size": 7936 
00:27:47.265 }, 00:27:47.265 { 00:27:47.265 "name": "BaseBdev2", 00:27:47.265 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:47.265 "is_configured": true, 00:27:47.265 "data_offset": 256, 00:27:47.265 "data_size": 7936 00:27:47.265 } 00:27:47.265 ] 00:27:47.265 }' 00:27:47.265 10:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:47.265 10:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:47.265 10:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:47.265 10:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:47.265 10:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.196 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.452 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:48.452 "name": "raid_bdev1", 00:27:48.452 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:48.452 "strip_size_kb": 0, 00:27:48.452 "state": "online", 00:27:48.452 "raid_level": "raid1", 00:27:48.452 "superblock": true, 00:27:48.452 "num_base_bdevs": 2, 00:27:48.452 "num_base_bdevs_discovered": 2, 00:27:48.452 "num_base_bdevs_operational": 2, 00:27:48.452 "process": { 00:27:48.452 "type": "rebuild", 00:27:48.452 "target": "spare", 00:27:48.452 "progress": { 00:27:48.452 "blocks": 7680, 00:27:48.452 "percent": 96 00:27:48.452 } 00:27:48.452 }, 00:27:48.452 "base_bdevs_list": [ 00:27:48.452 { 00:27:48.452 "name": "spare", 00:27:48.452 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:48.452 "is_configured": true, 00:27:48.452 "data_offset": 256, 00:27:48.452 "data_size": 7936 00:27:48.452 }, 00:27:48.452 { 00:27:48.452 "name": "BaseBdev2", 00:27:48.452 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:48.452 "is_configured": true, 00:27:48.452 "data_offset": 256, 00:27:48.452 "data_size": 7936 00:27:48.452 } 00:27:48.452 ] 00:27:48.452 }' 00:27:48.452 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:48.452 [2024-07-12 10:54:23.604227] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:48.452 [2024-07-12 10:54:23.604281] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 
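While the rebuild runs, the harness keeps re-reading the array and checking .process.type and .process.target, as the repeated rpc.py/jq pairs above show. A condensed, hypothetical sketch of that polling loop follows; the .progress.percent read is added only to surface the 35/54/96 percent values already visible in the JSON above.

# Hypothetical condensation of the repeated verify_raid_bdev_process checks above.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

while true; do
  info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  ptype=$(jq -r '.process.type // "none"' <<< "$info")
  ptarget=$(jq -r '.process.target // "none"' <<< "$info")
  [[ $ptype == rebuild && $ptarget == spare ]] || break   # the process object disappears once the rebuild is done
  jq -r '.process.progress.percent' <<< "$info"           # e.g. 35, 54, 96 in the trace above
  sleep 1
done
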
00:27:48.452 [2024-07-12 10:54:23.604361] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:48.452 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:48.452 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:48.709 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:48.709 10:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.637 10:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.201 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:50.201 "name": "raid_bdev1", 00:27:50.201 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:50.202 "strip_size_kb": 0, 00:27:50.202 "state": "online", 00:27:50.202 "raid_level": "raid1", 00:27:50.202 "superblock": true, 00:27:50.202 "num_base_bdevs": 2, 00:27:50.202 "num_base_bdevs_discovered": 2, 00:27:50.202 "num_base_bdevs_operational": 2, 00:27:50.202 "base_bdevs_list": [ 00:27:50.202 { 00:27:50.202 "name": "spare", 00:27:50.202 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:50.202 "is_configured": true, 00:27:50.202 "data_offset": 256, 00:27:50.202 "data_size": 7936 00:27:50.202 }, 00:27:50.202 { 00:27:50.202 "name": "BaseBdev2", 00:27:50.202 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:50.202 "is_configured": true, 00:27:50.202 "data_offset": 256, 00:27:50.202 "data_size": 7936 00:27:50.202 } 00:27:50.202 ] 00:27:50.202 }' 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:50.202 10:54:25 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.202 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:50.460 "name": "raid_bdev1", 00:27:50.460 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:50.460 "strip_size_kb": 0, 00:27:50.460 "state": "online", 00:27:50.460 "raid_level": "raid1", 00:27:50.460 "superblock": true, 00:27:50.460 "num_base_bdevs": 2, 00:27:50.460 "num_base_bdevs_discovered": 2, 00:27:50.460 "num_base_bdevs_operational": 2, 00:27:50.460 "base_bdevs_list": [ 00:27:50.460 { 00:27:50.460 "name": "spare", 00:27:50.460 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:50.460 "is_configured": true, 00:27:50.460 "data_offset": 256, 00:27:50.460 "data_size": 7936 00:27:50.460 }, 00:27:50.460 { 00:27:50.460 "name": "BaseBdev2", 00:27:50.460 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:50.460 "is_configured": true, 00:27:50.460 "data_offset": 256, 00:27:50.460 "data_size": 7936 00:27:50.460 } 00:27:50.460 ] 00:27:50.460 }' 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:50.460 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.717 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.717 "name": "raid_bdev1", 00:27:50.717 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:50.717 "strip_size_kb": 0, 00:27:50.717 "state": "online", 00:27:50.717 "raid_level": "raid1", 00:27:50.717 "superblock": true, 00:27:50.717 "num_base_bdevs": 2, 00:27:50.717 "num_base_bdevs_discovered": 2, 00:27:50.717 "num_base_bdevs_operational": 2, 00:27:50.717 "base_bdevs_list": [ 00:27:50.717 { 00:27:50.717 "name": "spare", 00:27:50.717 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:50.717 "is_configured": true, 00:27:50.717 "data_offset": 256, 00:27:50.717 "data_size": 7936 00:27:50.717 }, 00:27:50.717 { 00:27:50.717 "name": "BaseBdev2", 00:27:50.717 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:50.717 "is_configured": true, 00:27:50.717 "data_offset": 256, 00:27:50.717 "data_size": 7936 00:27:50.717 } 00:27:50.717 ] 00:27:50.717 }' 00:27:50.717 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.717 10:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:51.282 10:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:51.847 [2024-07-12 10:54:26.892632] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:51.847 [2024-07-12 10:54:26.892659] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:51.847 [2024-07-12 10:54:26.892719] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:51.847 [2024-07-12 10:54:26.892774] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:51.847 [2024-07-12 10:54:26.892786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24771c0 name raid_bdev1, state offline 00:27:51.847 10:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:27:51.847 10:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.103 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:52.103 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:52.103 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:52.103 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:52.103 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:52.103 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:52.103 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:52.104 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:52.104 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:52.104 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:52.104 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:52.104 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:52.104 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:52.360 /dev/nbd0 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:52.360 1+0 records in 00:27:52.360 1+0 records out 00:27:52.360 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272457 s, 15.0 MB/s 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:52.360 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:52.617 /dev/nbd1 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:52.617 1+0 records in 00:27:52.617 1+0 records out 00:27:52.617 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337271 s, 12.1 MB/s 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:52.617 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:52.875 10:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd0 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:52.875 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:53.132 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:53.132 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:53.132 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:53.132 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:53.132 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:53.132 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:53.132 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:53.133 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:53.133 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:53.133 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:53.389 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:53.646 [2024-07-12 10:54:28.757379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:53.646 [2024-07-12 10:54:28.757422] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:53.646 [2024-07-12 10:54:28.757444] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24769d0 00:27:53.646 [2024-07-12 10:54:28.757457] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:53.646 [2024-07-12 10:54:28.758934] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:53.646 [2024-07-12 10:54:28.758963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:53.646 [2024-07-12 10:54:28.759022] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:53.646 [2024-07-12 10:54:28.759046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:53.646 [2024-07-12 10:54:28.759142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:27:53.646 spare 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.646 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:53.903 [2024-07-12 10:54:28.859449] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23887c0 00:27:53.903 [2024-07-12 10:54:28.859464] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:53.903 [2024-07-12 10:54:28.859540] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2321180 00:27:53.903 [2024-07-12 10:54:28.859658] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23887c0 00:27:53.903 [2024-07-12 10:54:28.859670] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23887c0 00:27:53.903 [2024-07-12 10:54:28.859744] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:53.903 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:53.903 "name": "raid_bdev1", 00:27:53.903 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:53.903 "strip_size_kb": 0, 00:27:53.903 "state": "online", 00:27:53.903 "raid_level": "raid1", 00:27:53.903 "superblock": true, 00:27:53.903 "num_base_bdevs": 2, 00:27:53.903 "num_base_bdevs_discovered": 2, 00:27:53.903 "num_base_bdevs_operational": 2, 00:27:53.903 "base_bdevs_list": [ 00:27:53.903 { 00:27:53.903 "name": "spare", 00:27:53.903 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:53.903 "is_configured": true, 00:27:53.903 "data_offset": 256, 00:27:53.903 "data_size": 7936 00:27:53.903 }, 00:27:53.903 { 00:27:53.903 "name": "BaseBdev2", 00:27:53.903 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:53.903 "is_configured": true, 00:27:53.903 "data_offset": 256, 00:27:53.903 "data_size": 7936 00:27:53.903 } 00:27:53.903 ] 00:27:53.903 }' 00:27:53.903 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:53.903 10:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
-- # set +x 00:27:54.468 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:54.468 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:54.468 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:54.468 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:54.468 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:54.468 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.468 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.725 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:54.725 "name": "raid_bdev1", 00:27:54.725 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:54.725 "strip_size_kb": 0, 00:27:54.725 "state": "online", 00:27:54.725 "raid_level": "raid1", 00:27:54.725 "superblock": true, 00:27:54.725 "num_base_bdevs": 2, 00:27:54.725 "num_base_bdevs_discovered": 2, 00:27:54.725 "num_base_bdevs_operational": 2, 00:27:54.725 "base_bdevs_list": [ 00:27:54.725 { 00:27:54.725 "name": "spare", 00:27:54.725 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:54.725 "is_configured": true, 00:27:54.725 "data_offset": 256, 00:27:54.725 "data_size": 7936 00:27:54.725 }, 00:27:54.725 { 00:27:54.725 "name": "BaseBdev2", 00:27:54.725 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:54.725 "is_configured": true, 00:27:54.725 "data_offset": 256, 00:27:54.725 "data_size": 7936 00:27:54.725 } 00:27:54.725 ] 00:27:54.725 }' 00:27:54.725 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:54.725 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:54.725 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:54.725 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:54.725 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:54.725 10:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.983 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:54.983 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:55.240 [2024-07-12 10:54:30.377783] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:55.240 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:55.240 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:55.240 10:54:30 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.241 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:55.498 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:55.498 "name": "raid_bdev1", 00:27:55.498 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:55.498 "strip_size_kb": 0, 00:27:55.498 "state": "online", 00:27:55.498 "raid_level": "raid1", 00:27:55.498 "superblock": true, 00:27:55.498 "num_base_bdevs": 2, 00:27:55.498 "num_base_bdevs_discovered": 1, 00:27:55.498 "num_base_bdevs_operational": 1, 00:27:55.498 "base_bdevs_list": [ 00:27:55.498 { 00:27:55.498 "name": null, 00:27:55.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:55.498 "is_configured": false, 00:27:55.498 "data_offset": 256, 00:27:55.498 "data_size": 7936 00:27:55.498 }, 00:27:55.498 { 00:27:55.498 "name": "BaseBdev2", 00:27:55.498 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:55.498 "is_configured": true, 00:27:55.498 "data_offset": 256, 00:27:55.498 "data_size": 7936 00:27:55.498 } 00:27:55.498 ] 00:27:55.498 }' 00:27:55.498 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:55.498 10:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:56.430 10:54:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:56.687 [2024-07-12 10:54:31.757435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:56.687 [2024-07-12 10:54:31.757593] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:56.687 [2024-07-12 10:54:31.757610] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:56.687 [2024-07-12 10:54:31.757637] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:56.687 [2024-07-12 10:54:31.759834] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2389a70 00:27:56.687 [2024-07-12 10:54:31.761162] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:56.687 10:54:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:57.619 10:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:57.619 10:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:57.619 10:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:57.619 10:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:57.619 10:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:57.619 10:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.619 10:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.877 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:57.877 "name": "raid_bdev1", 00:27:57.877 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:57.877 "strip_size_kb": 0, 00:27:57.877 "state": "online", 00:27:57.877 "raid_level": "raid1", 00:27:57.877 "superblock": true, 00:27:57.877 "num_base_bdevs": 2, 00:27:57.877 "num_base_bdevs_discovered": 2, 00:27:57.877 "num_base_bdevs_operational": 2, 00:27:57.877 "process": { 00:27:57.877 "type": "rebuild", 00:27:57.877 "target": "spare", 00:27:57.877 "progress": { 00:27:57.877 "blocks": 3072, 00:27:57.877 "percent": 38 00:27:57.877 } 00:27:57.877 }, 00:27:57.877 "base_bdevs_list": [ 00:27:57.877 { 00:27:57.877 "name": "spare", 00:27:57.877 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:27:57.877 "is_configured": true, 00:27:57.877 "data_offset": 256, 00:27:57.877 "data_size": 7936 00:27:57.877 }, 00:27:57.877 { 00:27:57.877 "name": "BaseBdev2", 00:27:57.877 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:57.877 "is_configured": true, 00:27:57.877 "data_offset": 256, 00:27:57.877 "data_size": 7936 00:27:57.877 } 00:27:57.877 ] 00:27:57.877 }' 00:27:57.877 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:58.133 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:58.133 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:58.133 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:58.133 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:58.390 [2024-07-12 10:54:33.354766] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:58.390 [2024-07-12 10:54:33.374101] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:58.390 [2024-07-12 10:54:33.374147] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:58.390 [2024-07-12 10:54:33.374162] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:58.390 [2024-07-12 10:54:33.374171] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.390 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.647 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:58.647 "name": "raid_bdev1", 00:27:58.647 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:27:58.647 "strip_size_kb": 0, 00:27:58.647 "state": "online", 00:27:58.647 "raid_level": "raid1", 00:27:58.647 "superblock": true, 00:27:58.647 "num_base_bdevs": 2, 00:27:58.647 "num_base_bdevs_discovered": 1, 00:27:58.647 "num_base_bdevs_operational": 1, 00:27:58.647 "base_bdevs_list": [ 00:27:58.647 { 00:27:58.647 "name": null, 00:27:58.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.647 "is_configured": false, 00:27:58.647 "data_offset": 256, 00:27:58.647 "data_size": 7936 00:27:58.647 }, 00:27:58.647 { 00:27:58.647 "name": "BaseBdev2", 00:27:58.647 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:27:58.647 "is_configured": true, 00:27:58.647 "data_offset": 256, 00:27:58.647 "data_size": 7936 00:27:58.647 } 00:27:58.647 ] 00:27:58.647 }' 00:27:58.647 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:58.647 10:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:59.244 10:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:59.501 [2024-07-12 10:54:34.472671] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
00:27:59.501 [2024-07-12 10:54:34.472723] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:59.501 [2024-07-12 10:54:34.472748] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238a690 00:27:59.501 [2024-07-12 10:54:34.472761] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:59.501 [2024-07-12 10:54:34.472975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:59.501 [2024-07-12 10:54:34.472991] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:59.501 [2024-07-12 10:54:34.473051] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:59.501 [2024-07-12 10:54:34.473061] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:59.501 [2024-07-12 10:54:34.473072] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:59.501 [2024-07-12 10:54:34.473089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:59.501 [2024-07-12 10:54:34.475267] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x238a920 00:27:59.501 [2024-07-12 10:54:34.476634] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:59.501 spare 00:27:59.501 10:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:00.434 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:00.434 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.434 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:00.434 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:00.434 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.434 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.434 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.692 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.692 "name": "raid_bdev1", 00:28:00.692 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:28:00.692 "strip_size_kb": 0, 00:28:00.692 "state": "online", 00:28:00.692 "raid_level": "raid1", 00:28:00.692 "superblock": true, 00:28:00.692 "num_base_bdevs": 2, 00:28:00.692 "num_base_bdevs_discovered": 2, 00:28:00.692 "num_base_bdevs_operational": 2, 00:28:00.692 "process": { 00:28:00.692 "type": "rebuild", 00:28:00.692 "target": "spare", 00:28:00.692 "progress": { 00:28:00.692 "blocks": 3072, 00:28:00.692 "percent": 38 00:28:00.692 } 00:28:00.692 }, 00:28:00.692 "base_bdevs_list": [ 00:28:00.692 { 00:28:00.692 "name": "spare", 00:28:00.692 "uuid": "f9ad63a8-bceb-5960-9701-5e875bd69ad6", 00:28:00.692 "is_configured": true, 00:28:00.692 "data_offset": 256, 00:28:00.692 "data_size": 7936 00:28:00.692 }, 00:28:00.692 { 00:28:00.692 "name": "BaseBdev2", 00:28:00.692 "uuid": 
"9b027f54-d1a5-5517-a384-451abf21269a", 00:28:00.692 "is_configured": true, 00:28:00.692 "data_offset": 256, 00:28:00.692 "data_size": 7936 00:28:00.692 } 00:28:00.692 ] 00:28:00.692 }' 00:28:00.692 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.692 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:00.692 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.692 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:00.692 10:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:00.950 [2024-07-12 10:54:36.041834] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:00.950 [2024-07-12 10:54:36.089467] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:00.950 [2024-07-12 10:54:36.089518] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:00.950 [2024-07-12 10:54:36.089533] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:00.950 [2024-07-12 10:54:36.089542] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.951 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.208 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:01.208 "name": "raid_bdev1", 00:28:01.208 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:28:01.208 "strip_size_kb": 0, 00:28:01.208 "state": "online", 00:28:01.208 "raid_level": "raid1", 00:28:01.208 "superblock": true, 00:28:01.208 "num_base_bdevs": 2, 00:28:01.208 "num_base_bdevs_discovered": 1, 00:28:01.208 
"num_base_bdevs_operational": 1, 00:28:01.208 "base_bdevs_list": [ 00:28:01.208 { 00:28:01.208 "name": null, 00:28:01.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:01.208 "is_configured": false, 00:28:01.208 "data_offset": 256, 00:28:01.208 "data_size": 7936 00:28:01.208 }, 00:28:01.208 { 00:28:01.208 "name": "BaseBdev2", 00:28:01.208 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:28:01.208 "is_configured": true, 00:28:01.208 "data_offset": 256, 00:28:01.208 "data_size": 7936 00:28:01.208 } 00:28:01.208 ] 00:28:01.208 }' 00:28:01.208 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:01.208 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:02.138 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:02.138 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:02.138 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:02.138 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:02.138 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:02.138 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.138 10:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.138 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:02.138 "name": "raid_bdev1", 00:28:02.138 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:28:02.138 "strip_size_kb": 0, 00:28:02.138 "state": "online", 00:28:02.138 "raid_level": "raid1", 00:28:02.138 "superblock": true, 00:28:02.138 "num_base_bdevs": 2, 00:28:02.138 "num_base_bdevs_discovered": 1, 00:28:02.138 "num_base_bdevs_operational": 1, 00:28:02.138 "base_bdevs_list": [ 00:28:02.138 { 00:28:02.138 "name": null, 00:28:02.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:02.138 "is_configured": false, 00:28:02.138 "data_offset": 256, 00:28:02.138 "data_size": 7936 00:28:02.138 }, 00:28:02.138 { 00:28:02.138 "name": "BaseBdev2", 00:28:02.138 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:28:02.138 "is_configured": true, 00:28:02.138 "data_offset": 256, 00:28:02.138 "data_size": 7936 00:28:02.138 } 00:28:02.138 ] 00:28:02.138 }' 00:28:02.138 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:02.138 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:02.138 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:02.138 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:02.138 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:02.394 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:02.650 [2024-07-12 10:54:37.789067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:02.650 [2024-07-12 10:54:37.789111] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:02.650 [2024-07-12 10:54:37.789132] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2322900 00:28:02.650 [2024-07-12 10:54:37.789144] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:02.650 [2024-07-12 10:54:37.789321] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:02.650 [2024-07-12 10:54:37.789338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:02.650 [2024-07-12 10:54:37.789380] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:02.650 [2024-07-12 10:54:37.789391] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:02.650 [2024-07-12 10:54:37.789401] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:02.650 BaseBdev1 00:28:02.650 10:54:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.017 10:54:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.017 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:04.017 "name": "raid_bdev1", 00:28:04.017 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:28:04.017 "strip_size_kb": 0, 00:28:04.017 "state": "online", 00:28:04.017 "raid_level": "raid1", 00:28:04.017 "superblock": true, 00:28:04.017 "num_base_bdevs": 2, 00:28:04.017 "num_base_bdevs_discovered": 1, 00:28:04.017 "num_base_bdevs_operational": 1, 00:28:04.017 "base_bdevs_list": [ 00:28:04.017 { 
00:28:04.017 "name": null, 00:28:04.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:04.017 "is_configured": false, 00:28:04.017 "data_offset": 256, 00:28:04.017 "data_size": 7936 00:28:04.017 }, 00:28:04.017 { 00:28:04.017 "name": "BaseBdev2", 00:28:04.017 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:28:04.017 "is_configured": true, 00:28:04.017 "data_offset": 256, 00:28:04.017 "data_size": 7936 00:28:04.017 } 00:28:04.017 ] 00:28:04.017 }' 00:28:04.017 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:04.017 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:04.579 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:04.579 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:04.579 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:04.579 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:04.579 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:04.579 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.579 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.838 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:04.838 "name": "raid_bdev1", 00:28:04.838 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:28:04.838 "strip_size_kb": 0, 00:28:04.838 "state": "online", 00:28:04.838 "raid_level": "raid1", 00:28:04.838 "superblock": true, 00:28:04.838 "num_base_bdevs": 2, 00:28:04.838 "num_base_bdevs_discovered": 1, 00:28:04.838 "num_base_bdevs_operational": 1, 00:28:04.838 "base_bdevs_list": [ 00:28:04.838 { 00:28:04.838 "name": null, 00:28:04.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:04.838 "is_configured": false, 00:28:04.838 "data_offset": 256, 00:28:04.838 "data_size": 7936 00:28:04.838 }, 00:28:04.838 { 00:28:04.839 "name": "BaseBdev2", 00:28:04.839 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:28:04.839 "is_configured": true, 00:28:04.839 "data_offset": 256, 00:28:04.839 "data_size": 7936 00:28:04.839 } 00:28:04.839 ] 00:28:04.839 }' 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:04.839 10:54:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:05.097 [2024-07-12 10:54:40.215567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:05.097 [2024-07-12 10:54:40.215686] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:05.097 [2024-07-12 10:54:40.215700] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:05.097 request: 00:28:05.097 { 00:28:05.097 "base_bdev": "BaseBdev1", 00:28:05.097 "raid_bdev": "raid_bdev1", 00:28:05.097 "method": "bdev_raid_add_base_bdev", 00:28:05.097 "req_id": 1 00:28:05.097 } 00:28:05.097 Got JSON-RPC error response 00:28:05.097 response: 00:28:05.097 { 00:28:05.097 "code": -22, 00:28:05.097 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:05.097 } 00:28:05.097 10:54:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:05.097 10:54:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:05.097 10:54:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:05.097 10:54:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:05.097 10:54:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:06.467 10:54:41 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:06.467 "name": "raid_bdev1", 00:28:06.467 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:28:06.467 "strip_size_kb": 0, 00:28:06.467 "state": "online", 00:28:06.467 "raid_level": "raid1", 00:28:06.467 "superblock": true, 00:28:06.467 "num_base_bdevs": 2, 00:28:06.467 "num_base_bdevs_discovered": 1, 00:28:06.467 "num_base_bdevs_operational": 1, 00:28:06.467 "base_bdevs_list": [ 00:28:06.467 { 00:28:06.467 "name": null, 00:28:06.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:06.467 "is_configured": false, 00:28:06.467 "data_offset": 256, 00:28:06.467 "data_size": 7936 00:28:06.467 }, 00:28:06.467 { 00:28:06.467 "name": "BaseBdev2", 00:28:06.467 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:28:06.467 "is_configured": true, 00:28:06.467 "data_offset": 256, 00:28:06.467 "data_size": 7936 00:28:06.467 } 00:28:06.467 ] 00:28:06.467 }' 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:06.467 10:54:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:07.033 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:07.033 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:07.033 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:07.033 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:07.033 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:07.033 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.033 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:07.292 "name": "raid_bdev1", 00:28:07.292 "uuid": "b9ace996-72a8-4462-b384-1b8022624220", 00:28:07.292 "strip_size_kb": 0, 
00:28:07.292 "state": "online", 00:28:07.292 "raid_level": "raid1", 00:28:07.292 "superblock": true, 00:28:07.292 "num_base_bdevs": 2, 00:28:07.292 "num_base_bdevs_discovered": 1, 00:28:07.292 "num_base_bdevs_operational": 1, 00:28:07.292 "base_bdevs_list": [ 00:28:07.292 { 00:28:07.292 "name": null, 00:28:07.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.292 "is_configured": false, 00:28:07.292 "data_offset": 256, 00:28:07.292 "data_size": 7936 00:28:07.292 }, 00:28:07.292 { 00:28:07.292 "name": "BaseBdev2", 00:28:07.292 "uuid": "9b027f54-d1a5-5517-a384-451abf21269a", 00:28:07.292 "is_configured": true, 00:28:07.292 "data_offset": 256, 00:28:07.292 "data_size": 7936 00:28:07.292 } 00:28:07.292 ] 00:28:07.292 }' 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2165755 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2165755 ']' 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2165755 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2165755 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2165755' 00:28:07.292 killing process with pid 2165755 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2165755 00:28:07.292 Received shutdown signal, test time was about 60.000000 seconds 00:28:07.292 00:28:07.292 Latency(us) 00:28:07.292 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:07.292 =================================================================================================================== 00:28:07.292 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:07.292 [2024-07-12 10:54:42.407355] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:07.292 [2024-07-12 10:54:42.407440] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:07.292 [2024-07-12 10:54:42.407495] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:07.292 [2024-07-12 10:54:42.407508] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23887c0 name raid_bdev1, state offline 00:28:07.292 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 
2165755 00:28:07.292 [2024-07-12 10:54:42.440589] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:07.550 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:28:07.550 00:28:07.550 real 0m31.911s 00:28:07.550 user 0m50.099s 00:28:07.550 sys 0m5.056s 00:28:07.550 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:07.550 10:54:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:07.550 ************************************ 00:28:07.550 END TEST raid_rebuild_test_sb_md_separate 00:28:07.550 ************************************ 00:28:07.550 10:54:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:07.550 10:54:42 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:28:07.550 10:54:42 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:28:07.550 10:54:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:07.550 10:54:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:07.550 10:54:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:07.550 ************************************ 00:28:07.550 START TEST raid_state_function_test_sb_md_interleaved 00:28:07.550 ************************************ 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:07.550 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=2170408 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2170408' 00:28:07.551 Process raid pid: 2170408 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2170408 /var/tmp/spdk-raid.sock 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2170408 ']' 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:07.551 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:07.551 10:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:07.808 [2024-07-12 10:54:42.795661] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
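The trace above launches the bare bdev_svc application with bdev_raid debug logging (-L bdev_raid) on a private RPC socket and then waits for it to start listening before issuing any RPCs. A minimal sketch of that launch-and-wait step, assuming an SPDK checkout at the path shown in the trace and using rpc_get_methods purely as a readiness probe in place of the suite's own waitforlisten helper:

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path taken from the trace; adjust for a local checkout
SOCK=/var/tmp/spdk-raid.sock

# Start the minimal bdev application with bdev_raid debug logging on its own RPC socket.
"$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
raid_pid=$!

# Poll until the app answers on the socket (illustrative stand-in for waitforlisten).
until "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
done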
00:28:07.808 [2024-07-12 10:54:42.795724] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:07.808 [2024-07-12 10:54:42.914319] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:08.065 [2024-07-12 10:54:43.018083] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:08.065 [2024-07-12 10:54:43.078003] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:08.065 [2024-07-12 10:54:43.078031] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:08.629 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:08.629 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:08.629 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:08.885 [2024-07-12 10:54:43.959359] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:08.885 [2024-07-12 10:54:43.959402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:08.885 [2024-07-12 10:54:43.959413] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:08.885 [2024-07-12 10:54:43.959425] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:08.885 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.886 10:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:09.143 10:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:09.143 "name": "Existed_Raid", 00:28:09.143 "uuid": "1a5f8430-dcff-4f8b-98f4-38c61b8bbbf2", 00:28:09.143 "strip_size_kb": 0, 00:28:09.143 "state": "configuring", 00:28:09.143 "raid_level": "raid1", 00:28:09.143 "superblock": true, 00:28:09.143 "num_base_bdevs": 2, 00:28:09.143 "num_base_bdevs_discovered": 0, 00:28:09.143 "num_base_bdevs_operational": 2, 00:28:09.143 "base_bdevs_list": [ 00:28:09.143 { 00:28:09.143 "name": "BaseBdev1", 00:28:09.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.143 "is_configured": false, 00:28:09.143 "data_offset": 0, 00:28:09.143 "data_size": 0 00:28:09.143 }, 00:28:09.143 { 00:28:09.143 "name": "BaseBdev2", 00:28:09.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.143 "is_configured": false, 00:28:09.143 "data_offset": 0, 00:28:09.143 "data_size": 0 00:28:09.143 } 00:28:09.143 ] 00:28:09.143 }' 00:28:09.143 10:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:09.143 10:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:09.708 10:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:09.965 [2024-07-12 10:54:45.042091] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:09.965 [2024-07-12 10:54:45.042122] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d5a80 name Existed_Raid, state configuring 00:28:09.965 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:10.223 [2024-07-12 10:54:45.274729] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:10.223 [2024-07-12 10:54:45.274762] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:10.223 [2024-07-12 10:54:45.274771] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:10.223 [2024-07-12 10:54:45.274783] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:10.223 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:28:10.481 [2024-07-12 10:54:45.529507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:10.481 BaseBdev1 00:28:10.481 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:10.481 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:10.481 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:10.481 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:10.481 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:10.481 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:10.481 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:10.739 10:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:10.997 [ 00:28:10.997 { 00:28:10.997 "name": "BaseBdev1", 00:28:10.997 "aliases": [ 00:28:10.997 "c31229e4-e6c1-4049-8c5c-9d281e5f1af6" 00:28:10.997 ], 00:28:10.997 "product_name": "Malloc disk", 00:28:10.997 "block_size": 4128, 00:28:10.997 "num_blocks": 8192, 00:28:10.997 "uuid": "c31229e4-e6c1-4049-8c5c-9d281e5f1af6", 00:28:10.997 "md_size": 32, 00:28:10.997 "md_interleave": true, 00:28:10.997 "dif_type": 0, 00:28:10.997 "assigned_rate_limits": { 00:28:10.997 "rw_ios_per_sec": 0, 00:28:10.997 "rw_mbytes_per_sec": 0, 00:28:10.997 "r_mbytes_per_sec": 0, 00:28:10.997 "w_mbytes_per_sec": 0 00:28:10.997 }, 00:28:10.997 "claimed": true, 00:28:10.997 "claim_type": "exclusive_write", 00:28:10.997 "zoned": false, 00:28:10.997 "supported_io_types": { 00:28:10.997 "read": true, 00:28:10.997 "write": true, 00:28:10.997 "unmap": true, 00:28:10.997 "flush": true, 00:28:10.997 "reset": true, 00:28:10.997 "nvme_admin": false, 00:28:10.997 "nvme_io": false, 00:28:10.997 "nvme_io_md": false, 00:28:10.997 "write_zeroes": true, 00:28:10.997 "zcopy": true, 00:28:10.997 "get_zone_info": false, 00:28:10.997 "zone_management": false, 00:28:10.997 "zone_append": false, 00:28:10.997 "compare": false, 00:28:10.997 "compare_and_write": false, 00:28:10.997 "abort": true, 00:28:10.997 "seek_hole": false, 00:28:10.997 "seek_data": false, 00:28:10.997 "copy": true, 00:28:10.997 "nvme_iov_md": false 00:28:10.997 }, 00:28:10.997 "memory_domains": [ 00:28:10.997 { 00:28:10.997 "dma_device_id": "system", 00:28:10.997 "dma_device_type": 1 00:28:10.997 }, 00:28:10.997 { 00:28:10.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:10.997 "dma_device_type": 2 00:28:10.997 } 00:28:10.997 ], 00:28:10.997 "driver_specific": {} 00:28:10.997 } 00:28:10.997 ] 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.997 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:11.255 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.255 "name": "Existed_Raid", 00:28:11.255 "uuid": "4b46055c-4f48-4aae-ba31-0340a9b06bf8", 00:28:11.255 "strip_size_kb": 0, 00:28:11.255 "state": "configuring", 00:28:11.255 "raid_level": "raid1", 00:28:11.255 "superblock": true, 00:28:11.255 "num_base_bdevs": 2, 00:28:11.255 "num_base_bdevs_discovered": 1, 00:28:11.255 "num_base_bdevs_operational": 2, 00:28:11.255 "base_bdevs_list": [ 00:28:11.255 { 00:28:11.255 "name": "BaseBdev1", 00:28:11.255 "uuid": "c31229e4-e6c1-4049-8c5c-9d281e5f1af6", 00:28:11.255 "is_configured": true, 00:28:11.255 "data_offset": 256, 00:28:11.255 "data_size": 7936 00:28:11.255 }, 00:28:11.255 { 00:28:11.255 "name": "BaseBdev2", 00:28:11.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:11.255 "is_configured": false, 00:28:11.255 "data_offset": 0, 00:28:11.255 "data_size": 0 00:28:11.255 } 00:28:11.255 ] 00:28:11.255 }' 00:28:11.255 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.255 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:11.820 10:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:12.078 [2024-07-12 10:54:47.097646] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:12.078 [2024-07-12 10:54:47.097684] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d5350 name Existed_Raid, state configuring 00:28:12.078 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:12.335 [2024-07-12 10:54:47.342330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:12.335 [2024-07-12 10:54:47.343803] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:12.335 [2024-07-12 10:54:47.343834] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.335 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:12.593 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:12.593 "name": "Existed_Raid", 00:28:12.593 "uuid": "97550cc8-a288-4de2-85ce-cba38cf3e478", 00:28:12.593 "strip_size_kb": 0, 00:28:12.593 "state": "configuring", 00:28:12.593 "raid_level": "raid1", 00:28:12.593 "superblock": true, 00:28:12.593 "num_base_bdevs": 2, 00:28:12.593 "num_base_bdevs_discovered": 1, 00:28:12.593 "num_base_bdevs_operational": 2, 00:28:12.593 "base_bdevs_list": [ 00:28:12.593 { 00:28:12.593 "name": "BaseBdev1", 00:28:12.593 "uuid": "c31229e4-e6c1-4049-8c5c-9d281e5f1af6", 00:28:12.593 "is_configured": true, 00:28:12.593 "data_offset": 256, 00:28:12.593 "data_size": 7936 00:28:12.593 }, 00:28:12.593 { 00:28:12.593 "name": "BaseBdev2", 00:28:12.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:12.593 "is_configured": false, 00:28:12.593 "data_offset": 0, 00:28:12.593 "data_size": 0 00:28:12.593 } 00:28:12.593 ] 00:28:12.593 }' 00:28:12.593 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:12.593 10:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:13.158 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:28:13.416 [2024-07-12 10:54:48.476846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:13.416 [2024-07-12 10:54:48.476974] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7d7180 00:28:13.416 [2024-07-12 10:54:48.476987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:13.416 [2024-07-12 10:54:48.477045] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7d7150 00:28:13.416 [2024-07-12 10:54:48.477117] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7d7180 00:28:13.416 [2024-07-12 10:54:48.477127] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, 
raid_bdev 0x7d7180 00:28:13.416 [2024-07-12 10:54:48.477183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:13.416 BaseBdev2 00:28:13.416 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:13.416 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:13.416 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:13.416 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:13.416 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:13.416 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:13.416 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:13.674 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:13.932 [ 00:28:13.932 { 00:28:13.932 "name": "BaseBdev2", 00:28:13.932 "aliases": [ 00:28:13.932 "91331d05-a6ac-47c3-9cbe-c5db7a41d857" 00:28:13.932 ], 00:28:13.932 "product_name": "Malloc disk", 00:28:13.932 "block_size": 4128, 00:28:13.932 "num_blocks": 8192, 00:28:13.932 "uuid": "91331d05-a6ac-47c3-9cbe-c5db7a41d857", 00:28:13.932 "md_size": 32, 00:28:13.932 "md_interleave": true, 00:28:13.932 "dif_type": 0, 00:28:13.932 "assigned_rate_limits": { 00:28:13.932 "rw_ios_per_sec": 0, 00:28:13.932 "rw_mbytes_per_sec": 0, 00:28:13.932 "r_mbytes_per_sec": 0, 00:28:13.932 "w_mbytes_per_sec": 0 00:28:13.932 }, 00:28:13.932 "claimed": true, 00:28:13.932 "claim_type": "exclusive_write", 00:28:13.932 "zoned": false, 00:28:13.932 "supported_io_types": { 00:28:13.932 "read": true, 00:28:13.932 "write": true, 00:28:13.932 "unmap": true, 00:28:13.932 "flush": true, 00:28:13.932 "reset": true, 00:28:13.932 "nvme_admin": false, 00:28:13.932 "nvme_io": false, 00:28:13.932 "nvme_io_md": false, 00:28:13.932 "write_zeroes": true, 00:28:13.932 "zcopy": true, 00:28:13.932 "get_zone_info": false, 00:28:13.932 "zone_management": false, 00:28:13.932 "zone_append": false, 00:28:13.932 "compare": false, 00:28:13.932 "compare_and_write": false, 00:28:13.932 "abort": true, 00:28:13.932 "seek_hole": false, 00:28:13.932 "seek_data": false, 00:28:13.932 "copy": true, 00:28:13.932 "nvme_iov_md": false 00:28:13.932 }, 00:28:13.932 "memory_domains": [ 00:28:13.932 { 00:28:13.932 "dma_device_id": "system", 00:28:13.932 "dma_device_type": 1 00:28:13.932 }, 00:28:13.932 { 00:28:13.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.932 "dma_device_type": 2 00:28:13.932 } 00:28:13.932 ], 00:28:13.932 "driver_specific": {} 00:28:13.932 } 00:28:13.932 ] 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:13.932 10:54:48 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.932 10:54:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:14.190 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:14.190 "name": "Existed_Raid", 00:28:14.190 "uuid": "97550cc8-a288-4de2-85ce-cba38cf3e478", 00:28:14.190 "strip_size_kb": 0, 00:28:14.190 "state": "online", 00:28:14.190 "raid_level": "raid1", 00:28:14.190 "superblock": true, 00:28:14.190 "num_base_bdevs": 2, 00:28:14.190 "num_base_bdevs_discovered": 2, 00:28:14.190 "num_base_bdevs_operational": 2, 00:28:14.190 "base_bdevs_list": [ 00:28:14.190 { 00:28:14.190 "name": "BaseBdev1", 00:28:14.190 "uuid": "c31229e4-e6c1-4049-8c5c-9d281e5f1af6", 00:28:14.190 "is_configured": true, 00:28:14.190 "data_offset": 256, 00:28:14.190 "data_size": 7936 00:28:14.190 }, 00:28:14.190 { 00:28:14.190 "name": "BaseBdev2", 00:28:14.190 "uuid": "91331d05-a6ac-47c3-9cbe-c5db7a41d857", 00:28:14.190 "is_configured": true, 00:28:14.191 "data_offset": 256, 00:28:14.191 "data_size": 7936 00:28:14.191 } 00:28:14.191 ] 00:28:14.191 }' 00:28:14.191 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:14.191 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:14.755 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:14.755 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:14.755 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:14.755 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:14.755 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:28:14.755 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:14.756 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:14.756 10:54:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:15.013 [2024-07-12 10:54:50.093431] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:15.013 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:15.013 "name": "Existed_Raid", 00:28:15.013 "aliases": [ 00:28:15.013 "97550cc8-a288-4de2-85ce-cba38cf3e478" 00:28:15.013 ], 00:28:15.013 "product_name": "Raid Volume", 00:28:15.013 "block_size": 4128, 00:28:15.013 "num_blocks": 7936, 00:28:15.013 "uuid": "97550cc8-a288-4de2-85ce-cba38cf3e478", 00:28:15.013 "md_size": 32, 00:28:15.013 "md_interleave": true, 00:28:15.013 "dif_type": 0, 00:28:15.013 "assigned_rate_limits": { 00:28:15.013 "rw_ios_per_sec": 0, 00:28:15.013 "rw_mbytes_per_sec": 0, 00:28:15.013 "r_mbytes_per_sec": 0, 00:28:15.013 "w_mbytes_per_sec": 0 00:28:15.013 }, 00:28:15.013 "claimed": false, 00:28:15.013 "zoned": false, 00:28:15.013 "supported_io_types": { 00:28:15.013 "read": true, 00:28:15.013 "write": true, 00:28:15.013 "unmap": false, 00:28:15.013 "flush": false, 00:28:15.013 "reset": true, 00:28:15.013 "nvme_admin": false, 00:28:15.013 "nvme_io": false, 00:28:15.013 "nvme_io_md": false, 00:28:15.013 "write_zeroes": true, 00:28:15.013 "zcopy": false, 00:28:15.013 "get_zone_info": false, 00:28:15.013 "zone_management": false, 00:28:15.013 "zone_append": false, 00:28:15.013 "compare": false, 00:28:15.014 "compare_and_write": false, 00:28:15.014 "abort": false, 00:28:15.014 "seek_hole": false, 00:28:15.014 "seek_data": false, 00:28:15.014 "copy": false, 00:28:15.014 "nvme_iov_md": false 00:28:15.014 }, 00:28:15.014 "memory_domains": [ 00:28:15.014 { 00:28:15.014 "dma_device_id": "system", 00:28:15.014 "dma_device_type": 1 00:28:15.014 }, 00:28:15.014 { 00:28:15.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:15.014 "dma_device_type": 2 00:28:15.014 }, 00:28:15.014 { 00:28:15.014 "dma_device_id": "system", 00:28:15.014 "dma_device_type": 1 00:28:15.014 }, 00:28:15.014 { 00:28:15.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:15.014 "dma_device_type": 2 00:28:15.014 } 00:28:15.014 ], 00:28:15.014 "driver_specific": { 00:28:15.014 "raid": { 00:28:15.014 "uuid": "97550cc8-a288-4de2-85ce-cba38cf3e478", 00:28:15.014 "strip_size_kb": 0, 00:28:15.014 "state": "online", 00:28:15.014 "raid_level": "raid1", 00:28:15.014 "superblock": true, 00:28:15.014 "num_base_bdevs": 2, 00:28:15.014 "num_base_bdevs_discovered": 2, 00:28:15.014 "num_base_bdevs_operational": 2, 00:28:15.014 "base_bdevs_list": [ 00:28:15.014 { 00:28:15.014 "name": "BaseBdev1", 00:28:15.014 "uuid": "c31229e4-e6c1-4049-8c5c-9d281e5f1af6", 00:28:15.014 "is_configured": true, 00:28:15.014 "data_offset": 256, 00:28:15.014 "data_size": 7936 00:28:15.014 }, 00:28:15.014 { 00:28:15.014 "name": "BaseBdev2", 00:28:15.014 "uuid": "91331d05-a6ac-47c3-9cbe-c5db7a41d857", 00:28:15.014 "is_configured": true, 00:28:15.014 "data_offset": 256, 00:28:15.014 "data_size": 7936 00:28:15.014 } 00:28:15.014 ] 00:28:15.014 } 00:28:15.014 } 00:28:15.014 }' 00:28:15.014 10:54:50 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:15.014 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:15.014 BaseBdev2' 00:28:15.014 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:15.014 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:15.014 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:15.271 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:15.271 "name": "BaseBdev1", 00:28:15.271 "aliases": [ 00:28:15.271 "c31229e4-e6c1-4049-8c5c-9d281e5f1af6" 00:28:15.271 ], 00:28:15.271 "product_name": "Malloc disk", 00:28:15.271 "block_size": 4128, 00:28:15.271 "num_blocks": 8192, 00:28:15.271 "uuid": "c31229e4-e6c1-4049-8c5c-9d281e5f1af6", 00:28:15.271 "md_size": 32, 00:28:15.271 "md_interleave": true, 00:28:15.271 "dif_type": 0, 00:28:15.271 "assigned_rate_limits": { 00:28:15.271 "rw_ios_per_sec": 0, 00:28:15.271 "rw_mbytes_per_sec": 0, 00:28:15.271 "r_mbytes_per_sec": 0, 00:28:15.271 "w_mbytes_per_sec": 0 00:28:15.271 }, 00:28:15.271 "claimed": true, 00:28:15.271 "claim_type": "exclusive_write", 00:28:15.271 "zoned": false, 00:28:15.271 "supported_io_types": { 00:28:15.271 "read": true, 00:28:15.271 "write": true, 00:28:15.271 "unmap": true, 00:28:15.271 "flush": true, 00:28:15.271 "reset": true, 00:28:15.271 "nvme_admin": false, 00:28:15.272 "nvme_io": false, 00:28:15.272 "nvme_io_md": false, 00:28:15.272 "write_zeroes": true, 00:28:15.272 "zcopy": true, 00:28:15.272 "get_zone_info": false, 00:28:15.272 "zone_management": false, 00:28:15.272 "zone_append": false, 00:28:15.272 "compare": false, 00:28:15.272 "compare_and_write": false, 00:28:15.272 "abort": true, 00:28:15.272 "seek_hole": false, 00:28:15.272 "seek_data": false, 00:28:15.272 "copy": true, 00:28:15.272 "nvme_iov_md": false 00:28:15.272 }, 00:28:15.272 "memory_domains": [ 00:28:15.272 { 00:28:15.272 "dma_device_id": "system", 00:28:15.272 "dma_device_type": 1 00:28:15.272 }, 00:28:15.272 { 00:28:15.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:15.272 "dma_device_type": 2 00:28:15.272 } 00:28:15.272 ], 00:28:15.272 "driver_specific": {} 00:28:15.272 }' 00:28:15.272 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:15.272 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:15.529 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:15.786 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:15.786 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:15.786 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:15.786 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:15.786 10:54:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:16.043 "name": "BaseBdev2", 00:28:16.043 "aliases": [ 00:28:16.043 "91331d05-a6ac-47c3-9cbe-c5db7a41d857" 00:28:16.043 ], 00:28:16.043 "product_name": "Malloc disk", 00:28:16.043 "block_size": 4128, 00:28:16.043 "num_blocks": 8192, 00:28:16.043 "uuid": "91331d05-a6ac-47c3-9cbe-c5db7a41d857", 00:28:16.043 "md_size": 32, 00:28:16.043 "md_interleave": true, 00:28:16.043 "dif_type": 0, 00:28:16.043 "assigned_rate_limits": { 00:28:16.043 "rw_ios_per_sec": 0, 00:28:16.043 "rw_mbytes_per_sec": 0, 00:28:16.043 "r_mbytes_per_sec": 0, 00:28:16.043 "w_mbytes_per_sec": 0 00:28:16.043 }, 00:28:16.043 "claimed": true, 00:28:16.043 "claim_type": "exclusive_write", 00:28:16.043 "zoned": false, 00:28:16.043 "supported_io_types": { 00:28:16.043 "read": true, 00:28:16.043 "write": true, 00:28:16.043 "unmap": true, 00:28:16.043 "flush": true, 00:28:16.043 "reset": true, 00:28:16.043 "nvme_admin": false, 00:28:16.043 "nvme_io": false, 00:28:16.043 "nvme_io_md": false, 00:28:16.043 "write_zeroes": true, 00:28:16.043 "zcopy": true, 00:28:16.043 "get_zone_info": false, 00:28:16.043 "zone_management": false, 00:28:16.043 "zone_append": false, 00:28:16.043 "compare": false, 00:28:16.043 "compare_and_write": false, 00:28:16.043 "abort": true, 00:28:16.043 "seek_hole": false, 00:28:16.043 "seek_data": false, 00:28:16.043 "copy": true, 00:28:16.043 "nvme_iov_md": false 00:28:16.043 }, 00:28:16.043 "memory_domains": [ 00:28:16.043 { 00:28:16.043 "dma_device_id": "system", 00:28:16.043 "dma_device_type": 1 00:28:16.043 }, 00:28:16.043 { 00:28:16.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:16.043 "dma_device_type": 2 00:28:16.043 } 00:28:16.043 ], 00:28:16.043 "driver_specific": {} 00:28:16.043 }' 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
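The property loop above reads each base bdev back with bdev_get_bdevs and asserts the interleaved-metadata geometry: block_size 4128 (the 4096-byte data block with its 32 bytes of metadata folded in when interleaving is enabled), md_size 32, md_interleave true and dif_type 0. A condensed sketch of that check, assuming the trace's rpc.py path and socket, with a hypothetical helper name that is not part of bdev_raid.sh:

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }   # shorthand for the trace's rpc.py calls

check_interleaved_md() {   # hypothetical helper
        # bdev_get_bdevs -b returns a one-element array for the named bdev; assert its metadata geometry.
        rpc bdev_get_bdevs -b "$1" |
                jq -e '.[0] | .block_size == 4128 and .md_size == 32
                              and .md_interleave == true and .dif_type == 0' >/dev/null
}

check_interleaved_md BaseBdev1
check_interleaved_md BaseBdev2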
00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:16.043 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:16.337 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:16.337 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:16.337 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:16.337 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:16.337 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:16.608 [2024-07-12 10:54:51.553050] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:16.608 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.866 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:16.866 "name": "Existed_Raid", 00:28:16.866 "uuid": 
"97550cc8-a288-4de2-85ce-cba38cf3e478", 00:28:16.866 "strip_size_kb": 0, 00:28:16.866 "state": "online", 00:28:16.866 "raid_level": "raid1", 00:28:16.866 "superblock": true, 00:28:16.866 "num_base_bdevs": 2, 00:28:16.866 "num_base_bdevs_discovered": 1, 00:28:16.866 "num_base_bdevs_operational": 1, 00:28:16.866 "base_bdevs_list": [ 00:28:16.866 { 00:28:16.866 "name": null, 00:28:16.866 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:16.866 "is_configured": false, 00:28:16.866 "data_offset": 256, 00:28:16.866 "data_size": 7936 00:28:16.866 }, 00:28:16.866 { 00:28:16.866 "name": "BaseBdev2", 00:28:16.866 "uuid": "91331d05-a6ac-47c3-9cbe-c5db7a41d857", 00:28:16.866 "is_configured": true, 00:28:16.866 "data_offset": 256, 00:28:16.866 "data_size": 7936 00:28:16.866 } 00:28:16.866 ] 00:28:16.866 }' 00:28:16.866 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:16.866 10:54:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:17.799 10:54:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:17.799 10:54:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:17.800 10:54:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.800 10:54:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:17.800 10:54:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:17.800 10:54:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:17.800 10:54:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:18.057 [2024-07-12 10:54:53.158387] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:18.057 [2024-07-12 10:54:53.158470] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:18.057 [2024-07-12 10:54:53.169760] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:18.057 [2024-07-12 10:54:53.169792] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:18.057 [2024-07-12 10:54:53.169803] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d7180 name Existed_Raid, state offline 00:28:18.057 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:18.057 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:18.057 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.057 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:18.314 10:54:53 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2170408 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2170408 ']' 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2170408 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2170408 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2170408' 00:28:18.314 killing process with pid 2170408 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2170408 00:28:18.314 [2024-07-12 10:54:53.430403] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:18.314 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2170408 00:28:18.314 [2024-07-12 10:54:53.431308] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:18.572 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:28:18.572 00:28:18.572 real 0m10.923s 00:28:18.572 user 0m19.459s 00:28:18.572 sys 0m2.008s 00:28:18.572 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:18.572 10:54:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:18.572 ************************************ 00:28:18.572 END TEST raid_state_function_test_sb_md_interleaved 00:28:18.572 ************************************ 00:28:18.572 10:54:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:18.572 10:54:53 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:28:18.572 10:54:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:18.572 10:54:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:18.572 10:54:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:18.572 ************************************ 00:28:18.572 START TEST raid_superblock_test_md_interleaved 00:28:18.572 ************************************ 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2172036 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2172036 /var/tmp/spdk-raid.sock 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2172036 ']' 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:18.572 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:18.572 10:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:18.829 [2024-07-12 10:54:53.809077] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
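The arrays initialized above (base_bdevs_malloc, base_bdevs_pt, base_bdevs_pt_uuid) describe the two passthru-on-malloc bases the superblock test builds before assembling raid_bdev1. The RPC calls that follow in the trace reduce to roughly the sketch below; rpc is my own shorthand for the trace's rpc.py invocation, and the fixed UUIDs match the ones the test feeds to bdev_passthru_create:

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }   # shorthand for the trace's rpc.py calls

# Two 32 MiB malloc bdevs with 4096-byte blocks and 32 bytes of interleaved metadata,
# each wrapped in a passthru bdev with a fixed UUID, then combined into a raid1 carrying a superblock (-s).
for i in 1 2; do
        rpc bdev_malloc_create 32 4096 -m 32 -i -b "malloc$i"
        rpc bdev_passthru_create -b "malloc$i" -p "pt$i" -u "00000000-0000-0000-0000-00000000000$i"
done
rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s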
00:28:18.829 [2024-07-12 10:54:53.809147] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2172036 ] 00:28:18.829 [2024-07-12 10:54:53.937860] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:19.086 [2024-07-12 10:54:54.038342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:19.086 [2024-07-12 10:54:54.097951] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:19.086 [2024-07-12 10:54:54.097988] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:19.649 10:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:28:19.906 malloc1 00:28:19.906 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:20.164 [2024-07-12 10:54:55.233789] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:20.164 [2024-07-12 10:54:55.233837] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:20.164 [2024-07-12 10:54:55.233859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21144e0 00:28:20.164 [2024-07-12 10:54:55.233872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:20.164 [2024-07-12 10:54:55.235290] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:20.164 [2024-07-12 10:54:55.235318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:20.164 pt1 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:20.164 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:28:20.421 malloc2 00:28:20.421 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:20.678 [2024-07-12 10:54:55.732085] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:20.678 [2024-07-12 10:54:55.732131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:20.678 [2024-07-12 10:54:55.732150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f9570 00:28:20.678 [2024-07-12 10:54:55.732163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:20.678 [2024-07-12 10:54:55.733490] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:20.678 [2024-07-12 10:54:55.733517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:20.678 pt2 00:28:20.678 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:20.678 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:20.678 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:20.935 [2024-07-12 10:54:55.968726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:20.935 [2024-07-12 10:54:55.970045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:20.935 [2024-07-12 10:54:55.970186] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20faf20 00:28:20.935 [2024-07-12 10:54:55.970199] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:20.935 [2024-07-12 10:54:55.970259] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f77050 00:28:20.935 [2024-07-12 10:54:55.970338] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20faf20 00:28:20.935 [2024-07-12 10:54:55.970348] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20faf20 00:28:20.935 [2024-07-12 10:54:55.970400] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.935 10:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.192 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:21.192 "name": "raid_bdev1", 00:28:21.192 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:21.192 "strip_size_kb": 0, 00:28:21.192 "state": "online", 00:28:21.192 "raid_level": "raid1", 00:28:21.192 "superblock": true, 00:28:21.192 "num_base_bdevs": 2, 00:28:21.192 "num_base_bdevs_discovered": 2, 00:28:21.192 "num_base_bdevs_operational": 2, 00:28:21.192 "base_bdevs_list": [ 00:28:21.192 { 00:28:21.192 "name": "pt1", 00:28:21.192 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:21.192 "is_configured": true, 00:28:21.192 "data_offset": 256, 00:28:21.192 "data_size": 7936 00:28:21.192 }, 00:28:21.192 { 00:28:21.192 "name": "pt2", 00:28:21.192 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:21.192 "is_configured": true, 00:28:21.192 "data_offset": 256, 00:28:21.192 "data_size": 7936 00:28:21.192 } 00:28:21.192 ] 00:28:21.192 }' 00:28:21.192 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:21.192 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 
00:28:21.757 10:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:22.015 [2024-07-12 10:54:57.043804] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:22.015 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:22.015 "name": "raid_bdev1", 00:28:22.015 "aliases": [ 00:28:22.015 "343d1f35-3876-487c-829c-762cc5f1b7c2" 00:28:22.015 ], 00:28:22.015 "product_name": "Raid Volume", 00:28:22.015 "block_size": 4128, 00:28:22.015 "num_blocks": 7936, 00:28:22.015 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:22.015 "md_size": 32, 00:28:22.015 "md_interleave": true, 00:28:22.015 "dif_type": 0, 00:28:22.015 "assigned_rate_limits": { 00:28:22.015 "rw_ios_per_sec": 0, 00:28:22.015 "rw_mbytes_per_sec": 0, 00:28:22.015 "r_mbytes_per_sec": 0, 00:28:22.015 "w_mbytes_per_sec": 0 00:28:22.015 }, 00:28:22.015 "claimed": false, 00:28:22.015 "zoned": false, 00:28:22.015 "supported_io_types": { 00:28:22.015 "read": true, 00:28:22.015 "write": true, 00:28:22.015 "unmap": false, 00:28:22.015 "flush": false, 00:28:22.015 "reset": true, 00:28:22.015 "nvme_admin": false, 00:28:22.015 "nvme_io": false, 00:28:22.015 "nvme_io_md": false, 00:28:22.015 "write_zeroes": true, 00:28:22.015 "zcopy": false, 00:28:22.015 "get_zone_info": false, 00:28:22.015 "zone_management": false, 00:28:22.015 "zone_append": false, 00:28:22.015 "compare": false, 00:28:22.015 "compare_and_write": false, 00:28:22.015 "abort": false, 00:28:22.015 "seek_hole": false, 00:28:22.015 "seek_data": false, 00:28:22.015 "copy": false, 00:28:22.015 "nvme_iov_md": false 00:28:22.015 }, 00:28:22.015 "memory_domains": [ 00:28:22.015 { 00:28:22.015 "dma_device_id": "system", 00:28:22.015 "dma_device_type": 1 00:28:22.015 }, 00:28:22.015 { 00:28:22.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.015 "dma_device_type": 2 00:28:22.015 }, 00:28:22.015 { 00:28:22.015 "dma_device_id": "system", 00:28:22.015 "dma_device_type": 1 00:28:22.015 }, 00:28:22.015 { 00:28:22.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.015 "dma_device_type": 2 00:28:22.015 } 00:28:22.015 ], 00:28:22.015 "driver_specific": { 00:28:22.015 "raid": { 00:28:22.015 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:22.015 "strip_size_kb": 0, 00:28:22.015 "state": "online", 00:28:22.015 "raid_level": "raid1", 00:28:22.015 "superblock": true, 00:28:22.015 "num_base_bdevs": 2, 00:28:22.015 "num_base_bdevs_discovered": 2, 00:28:22.015 "num_base_bdevs_operational": 2, 00:28:22.015 "base_bdevs_list": [ 00:28:22.015 { 00:28:22.015 "name": "pt1", 00:28:22.015 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:22.015 "is_configured": true, 00:28:22.015 "data_offset": 256, 00:28:22.015 "data_size": 7936 00:28:22.015 }, 00:28:22.015 { 00:28:22.015 "name": "pt2", 00:28:22.015 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:22.015 "is_configured": true, 00:28:22.015 "data_offset": 256, 00:28:22.015 "data_size": 7936 00:28:22.015 } 00:28:22.015 ] 00:28:22.015 } 00:28:22.015 } 00:28:22.015 }' 00:28:22.015 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:22.015 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:22.015 pt2' 00:28:22.015 10:54:57 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:22.015 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:22.015 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:22.272 "name": "pt1", 00:28:22.272 "aliases": [ 00:28:22.272 "00000000-0000-0000-0000-000000000001" 00:28:22.272 ], 00:28:22.272 "product_name": "passthru", 00:28:22.272 "block_size": 4128, 00:28:22.272 "num_blocks": 8192, 00:28:22.272 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:22.272 "md_size": 32, 00:28:22.272 "md_interleave": true, 00:28:22.272 "dif_type": 0, 00:28:22.272 "assigned_rate_limits": { 00:28:22.272 "rw_ios_per_sec": 0, 00:28:22.272 "rw_mbytes_per_sec": 0, 00:28:22.272 "r_mbytes_per_sec": 0, 00:28:22.272 "w_mbytes_per_sec": 0 00:28:22.272 }, 00:28:22.272 "claimed": true, 00:28:22.272 "claim_type": "exclusive_write", 00:28:22.272 "zoned": false, 00:28:22.272 "supported_io_types": { 00:28:22.272 "read": true, 00:28:22.272 "write": true, 00:28:22.272 "unmap": true, 00:28:22.272 "flush": true, 00:28:22.272 "reset": true, 00:28:22.272 "nvme_admin": false, 00:28:22.272 "nvme_io": false, 00:28:22.272 "nvme_io_md": false, 00:28:22.272 "write_zeroes": true, 00:28:22.272 "zcopy": true, 00:28:22.272 "get_zone_info": false, 00:28:22.272 "zone_management": false, 00:28:22.272 "zone_append": false, 00:28:22.272 "compare": false, 00:28:22.272 "compare_and_write": false, 00:28:22.272 "abort": true, 00:28:22.272 "seek_hole": false, 00:28:22.272 "seek_data": false, 00:28:22.272 "copy": true, 00:28:22.272 "nvme_iov_md": false 00:28:22.272 }, 00:28:22.272 "memory_domains": [ 00:28:22.272 { 00:28:22.272 "dma_device_id": "system", 00:28:22.272 "dma_device_type": 1 00:28:22.272 }, 00:28:22.272 { 00:28:22.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.272 "dma_device_type": 2 00:28:22.272 } 00:28:22.272 ], 00:28:22.272 "driver_specific": { 00:28:22.272 "passthru": { 00:28:22.272 "name": "pt1", 00:28:22.272 "base_bdev_name": "malloc1" 00:28:22.272 } 00:28:22.272 } 00:28:22.272 }' 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:22.272 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.529 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.529 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:22.529 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.529 10:54:57 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.529 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:22.529 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:22.529 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:22.529 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:22.787 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:22.787 "name": "pt2", 00:28:22.787 "aliases": [ 00:28:22.787 "00000000-0000-0000-0000-000000000002" 00:28:22.787 ], 00:28:22.787 "product_name": "passthru", 00:28:22.787 "block_size": 4128, 00:28:22.787 "num_blocks": 8192, 00:28:22.787 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:22.787 "md_size": 32, 00:28:22.787 "md_interleave": true, 00:28:22.787 "dif_type": 0, 00:28:22.787 "assigned_rate_limits": { 00:28:22.787 "rw_ios_per_sec": 0, 00:28:22.787 "rw_mbytes_per_sec": 0, 00:28:22.787 "r_mbytes_per_sec": 0, 00:28:22.787 "w_mbytes_per_sec": 0 00:28:22.787 }, 00:28:22.787 "claimed": true, 00:28:22.787 "claim_type": "exclusive_write", 00:28:22.787 "zoned": false, 00:28:22.787 "supported_io_types": { 00:28:22.787 "read": true, 00:28:22.787 "write": true, 00:28:22.787 "unmap": true, 00:28:22.787 "flush": true, 00:28:22.787 "reset": true, 00:28:22.787 "nvme_admin": false, 00:28:22.787 "nvme_io": false, 00:28:22.787 "nvme_io_md": false, 00:28:22.787 "write_zeroes": true, 00:28:22.787 "zcopy": true, 00:28:22.787 "get_zone_info": false, 00:28:22.787 "zone_management": false, 00:28:22.787 "zone_append": false, 00:28:22.787 "compare": false, 00:28:22.787 "compare_and_write": false, 00:28:22.787 "abort": true, 00:28:22.787 "seek_hole": false, 00:28:22.787 "seek_data": false, 00:28:22.787 "copy": true, 00:28:22.787 "nvme_iov_md": false 00:28:22.787 }, 00:28:22.787 "memory_domains": [ 00:28:22.787 { 00:28:22.787 "dma_device_id": "system", 00:28:22.787 "dma_device_type": 1 00:28:22.787 }, 00:28:22.787 { 00:28:22.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.787 "dma_device_type": 2 00:28:22.787 } 00:28:22.787 ], 00:28:22.787 "driver_specific": { 00:28:22.787 "passthru": { 00:28:22.787 "name": "pt2", 00:28:22.787 "base_bdev_name": "malloc2" 00:28:22.787 } 00:28:22.787 } 00:28:22.787 }' 00:28:22.787 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.787 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.787 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:22.787 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:23.044 10:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:23.044 10:54:58 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:23.044 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:23.302 [2024-07-12 10:54:58.443496] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:23.302 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=343d1f35-3876-487c-829c-762cc5f1b7c2 00:28:23.302 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 343d1f35-3876-487c-829c-762cc5f1b7c2 ']' 00:28:23.302 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:23.560 [2024-07-12 10:54:58.691909] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:23.560 [2024-07-12 10:54:58.691929] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:23.560 [2024-07-12 10:54:58.691983] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:23.560 [2024-07-12 10:54:58.692036] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:23.560 [2024-07-12 10:54:58.692047] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20faf20 name raid_bdev1, state offline 00:28:23.560 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.560 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:23.817 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:23.817 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:23.817 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:23.817 10:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:24.074 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:24.074 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:24.331 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:24.331 10:54:59 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:24.589 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:24.847 [2024-07-12 10:54:59.931140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:24.847 [2024-07-12 10:54:59.932478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:24.847 [2024-07-12 10:54:59.932543] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:24.847 [2024-07-12 10:54:59.932582] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:24.847 [2024-07-12 10:54:59.932601] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:24.847 [2024-07-12 10:54:59.932610] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2105260 name raid_bdev1, state configuring 00:28:24.847 request: 00:28:24.847 { 00:28:24.847 "name": "raid_bdev1", 00:28:24.847 "raid_level": "raid1", 00:28:24.847 "base_bdevs": [ 00:28:24.847 "malloc1", 00:28:24.847 "malloc2" 00:28:24.847 ], 00:28:24.847 "superblock": false, 00:28:24.847 "method": 
"bdev_raid_create", 00:28:24.847 "req_id": 1 00:28:24.847 } 00:28:24.847 Got JSON-RPC error response 00:28:24.847 response: 00:28:24.847 { 00:28:24.847 "code": -17, 00:28:24.847 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:24.847 } 00:28:24.847 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:24.847 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:24.847 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:24.847 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:24.847 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.847 10:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:25.104 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:25.104 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:25.104 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:25.668 [2024-07-12 10:55:00.701113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:25.668 [2024-07-12 10:55:00.701167] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:25.668 [2024-07-12 10:55:00.701192] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fc000 00:28:25.668 [2024-07-12 10:55:00.701205] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:25.668 [2024-07-12 10:55:00.702637] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:25.669 [2024-07-12 10:55:00.702665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:25.669 [2024-07-12 10:55:00.702714] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:25.669 [2024-07-12 10:55:00.702741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:25.669 pt1 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.669 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.926 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.926 "name": "raid_bdev1", 00:28:25.926 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:25.926 "strip_size_kb": 0, 00:28:25.926 "state": "configuring", 00:28:25.926 "raid_level": "raid1", 00:28:25.926 "superblock": true, 00:28:25.926 "num_base_bdevs": 2, 00:28:25.926 "num_base_bdevs_discovered": 1, 00:28:25.926 "num_base_bdevs_operational": 2, 00:28:25.926 "base_bdevs_list": [ 00:28:25.926 { 00:28:25.926 "name": "pt1", 00:28:25.926 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:25.926 "is_configured": true, 00:28:25.926 "data_offset": 256, 00:28:25.926 "data_size": 7936 00:28:25.926 }, 00:28:25.926 { 00:28:25.926 "name": null, 00:28:25.926 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:25.926 "is_configured": false, 00:28:25.926 "data_offset": 256, 00:28:25.926 "data_size": 7936 00:28:25.926 } 00:28:25.926 ] 00:28:25.926 }' 00:28:25.926 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.926 10:55:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:26.490 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:26.490 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:26.490 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:26.490 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:26.490 [2024-07-12 10:55:01.651633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:26.490 [2024-07-12 10:55:01.651682] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:26.490 [2024-07-12 10:55:01.651703] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fe270 00:28:26.490 [2024-07-12 10:55:01.651716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:26.490 [2024-07-12 10:55:01.651885] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:26.490 [2024-07-12 10:55:01.651906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:26.490 [2024-07-12 10:55:01.651950] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:26.490 [2024-07-12 10:55:01.651968] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:26.490 [2024-07-12 10:55:01.652048] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f77c10 00:28:26.491 [2024-07-12 10:55:01.652057] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:26.491 [2024-07-12 10:55:01.652114] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f9d40 00:28:26.491 [2024-07-12 10:55:01.652188] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f77c10 00:28:26.491 [2024-07-12 10:55:01.652198] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f77c10 00:28:26.491 [2024-07-12 10:55:01.652253] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:26.491 pt2 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.491 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.748 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.748 "name": "raid_bdev1", 00:28:26.748 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:26.748 "strip_size_kb": 0, 00:28:26.748 "state": "online", 00:28:26.748 "raid_level": "raid1", 00:28:26.748 "superblock": true, 00:28:26.748 "num_base_bdevs": 2, 00:28:26.748 "num_base_bdevs_discovered": 2, 00:28:26.748 "num_base_bdevs_operational": 2, 00:28:26.748 "base_bdevs_list": [ 00:28:26.748 { 00:28:26.748 "name": "pt1", 00:28:26.748 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:26.748 "is_configured": true, 00:28:26.748 "data_offset": 256, 00:28:26.748 "data_size": 7936 00:28:26.748 }, 00:28:26.748 { 00:28:26.748 "name": "pt2", 00:28:26.748 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:26.748 "is_configured": true, 00:28:26.748 "data_offset": 256, 00:28:26.748 "data_size": 7936 00:28:26.748 } 00:28:26.748 ] 00:28:26.748 }' 00:28:26.748 10:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.748 10:55:01 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:27.679 [2024-07-12 10:55:02.742796] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:27.679 "name": "raid_bdev1", 00:28:27.679 "aliases": [ 00:28:27.679 "343d1f35-3876-487c-829c-762cc5f1b7c2" 00:28:27.679 ], 00:28:27.679 "product_name": "Raid Volume", 00:28:27.679 "block_size": 4128, 00:28:27.679 "num_blocks": 7936, 00:28:27.679 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:27.679 "md_size": 32, 00:28:27.679 "md_interleave": true, 00:28:27.679 "dif_type": 0, 00:28:27.679 "assigned_rate_limits": { 00:28:27.679 "rw_ios_per_sec": 0, 00:28:27.679 "rw_mbytes_per_sec": 0, 00:28:27.679 "r_mbytes_per_sec": 0, 00:28:27.679 "w_mbytes_per_sec": 0 00:28:27.679 }, 00:28:27.679 "claimed": false, 00:28:27.679 "zoned": false, 00:28:27.679 "supported_io_types": { 00:28:27.679 "read": true, 00:28:27.679 "write": true, 00:28:27.679 "unmap": false, 00:28:27.679 "flush": false, 00:28:27.679 "reset": true, 00:28:27.679 "nvme_admin": false, 00:28:27.679 "nvme_io": false, 00:28:27.679 "nvme_io_md": false, 00:28:27.679 "write_zeroes": true, 00:28:27.679 "zcopy": false, 00:28:27.679 "get_zone_info": false, 00:28:27.679 "zone_management": false, 00:28:27.679 "zone_append": false, 00:28:27.679 "compare": false, 00:28:27.679 "compare_and_write": false, 00:28:27.679 "abort": false, 00:28:27.679 "seek_hole": false, 00:28:27.679 "seek_data": false, 00:28:27.679 "copy": false, 00:28:27.679 "nvme_iov_md": false 00:28:27.679 }, 00:28:27.679 "memory_domains": [ 00:28:27.679 { 00:28:27.679 "dma_device_id": "system", 00:28:27.679 "dma_device_type": 1 00:28:27.679 }, 00:28:27.679 { 00:28:27.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:27.679 "dma_device_type": 2 00:28:27.679 }, 00:28:27.679 { 00:28:27.679 "dma_device_id": "system", 00:28:27.679 "dma_device_type": 1 00:28:27.679 }, 00:28:27.679 { 00:28:27.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:27.679 "dma_device_type": 2 00:28:27.679 } 00:28:27.679 ], 00:28:27.679 "driver_specific": { 00:28:27.679 "raid": { 00:28:27.679 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:27.679 "strip_size_kb": 0, 00:28:27.679 "state": "online", 00:28:27.679 "raid_level": "raid1", 00:28:27.679 "superblock": true, 00:28:27.679 "num_base_bdevs": 2, 00:28:27.679 
"num_base_bdevs_discovered": 2, 00:28:27.679 "num_base_bdevs_operational": 2, 00:28:27.679 "base_bdevs_list": [ 00:28:27.679 { 00:28:27.679 "name": "pt1", 00:28:27.679 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:27.679 "is_configured": true, 00:28:27.679 "data_offset": 256, 00:28:27.679 "data_size": 7936 00:28:27.679 }, 00:28:27.679 { 00:28:27.679 "name": "pt2", 00:28:27.679 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:27.679 "is_configured": true, 00:28:27.679 "data_offset": 256, 00:28:27.679 "data_size": 7936 00:28:27.679 } 00:28:27.679 ] 00:28:27.679 } 00:28:27.679 } 00:28:27.679 }' 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:27.679 pt2' 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:27.679 10:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:27.936 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:27.936 "name": "pt1", 00:28:27.936 "aliases": [ 00:28:27.936 "00000000-0000-0000-0000-000000000001" 00:28:27.936 ], 00:28:27.936 "product_name": "passthru", 00:28:27.936 "block_size": 4128, 00:28:27.936 "num_blocks": 8192, 00:28:27.936 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:27.936 "md_size": 32, 00:28:27.936 "md_interleave": true, 00:28:27.936 "dif_type": 0, 00:28:27.936 "assigned_rate_limits": { 00:28:27.936 "rw_ios_per_sec": 0, 00:28:27.936 "rw_mbytes_per_sec": 0, 00:28:27.936 "r_mbytes_per_sec": 0, 00:28:27.936 "w_mbytes_per_sec": 0 00:28:27.936 }, 00:28:27.936 "claimed": true, 00:28:27.936 "claim_type": "exclusive_write", 00:28:27.936 "zoned": false, 00:28:27.936 "supported_io_types": { 00:28:27.936 "read": true, 00:28:27.936 "write": true, 00:28:27.936 "unmap": true, 00:28:27.936 "flush": true, 00:28:27.936 "reset": true, 00:28:27.936 "nvme_admin": false, 00:28:27.936 "nvme_io": false, 00:28:27.936 "nvme_io_md": false, 00:28:27.936 "write_zeroes": true, 00:28:27.936 "zcopy": true, 00:28:27.936 "get_zone_info": false, 00:28:27.936 "zone_management": false, 00:28:27.936 "zone_append": false, 00:28:27.936 "compare": false, 00:28:27.936 "compare_and_write": false, 00:28:27.936 "abort": true, 00:28:27.936 "seek_hole": false, 00:28:27.936 "seek_data": false, 00:28:27.936 "copy": true, 00:28:27.936 "nvme_iov_md": false 00:28:27.936 }, 00:28:27.936 "memory_domains": [ 00:28:27.936 { 00:28:27.936 "dma_device_id": "system", 00:28:27.936 "dma_device_type": 1 00:28:27.936 }, 00:28:27.936 { 00:28:27.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:27.936 "dma_device_type": 2 00:28:27.936 } 00:28:27.936 ], 00:28:27.936 "driver_specific": { 00:28:27.936 "passthru": { 00:28:27.936 "name": "pt1", 00:28:27.936 "base_bdev_name": "malloc1" 00:28:27.936 } 00:28:27.936 } 00:28:27.936 }' 00:28:27.936 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:27.936 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:28.193 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:28.450 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:28.450 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:28.450 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:28.450 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:28.708 "name": "pt2", 00:28:28.708 "aliases": [ 00:28:28.708 "00000000-0000-0000-0000-000000000002" 00:28:28.708 ], 00:28:28.708 "product_name": "passthru", 00:28:28.708 "block_size": 4128, 00:28:28.708 "num_blocks": 8192, 00:28:28.708 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:28.708 "md_size": 32, 00:28:28.708 "md_interleave": true, 00:28:28.708 "dif_type": 0, 00:28:28.708 "assigned_rate_limits": { 00:28:28.708 "rw_ios_per_sec": 0, 00:28:28.708 "rw_mbytes_per_sec": 0, 00:28:28.708 "r_mbytes_per_sec": 0, 00:28:28.708 "w_mbytes_per_sec": 0 00:28:28.708 }, 00:28:28.708 "claimed": true, 00:28:28.708 "claim_type": "exclusive_write", 00:28:28.708 "zoned": false, 00:28:28.708 "supported_io_types": { 00:28:28.708 "read": true, 00:28:28.708 "write": true, 00:28:28.708 "unmap": true, 00:28:28.708 "flush": true, 00:28:28.708 "reset": true, 00:28:28.708 "nvme_admin": false, 00:28:28.708 "nvme_io": false, 00:28:28.708 "nvme_io_md": false, 00:28:28.708 "write_zeroes": true, 00:28:28.708 "zcopy": true, 00:28:28.708 "get_zone_info": false, 00:28:28.708 "zone_management": false, 00:28:28.708 "zone_append": false, 00:28:28.708 "compare": false, 00:28:28.708 "compare_and_write": false, 00:28:28.708 "abort": true, 00:28:28.708 "seek_hole": false, 00:28:28.708 "seek_data": false, 00:28:28.708 "copy": true, 00:28:28.708 "nvme_iov_md": false 00:28:28.708 }, 00:28:28.708 "memory_domains": [ 00:28:28.708 { 00:28:28.708 "dma_device_id": "system", 00:28:28.708 "dma_device_type": 1 00:28:28.708 }, 00:28:28.708 { 00:28:28.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:28.708 "dma_device_type": 2 00:28:28.708 } 00:28:28.708 ], 00:28:28.708 "driver_specific": { 00:28:28.708 "passthru": { 00:28:28.708 "name": "pt2", 00:28:28.708 "base_bdev_name": "malloc2" 00:28:28.708 } 00:28:28.708 } 00:28:28.708 }' 00:28:28.708 10:55:03 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:28.708 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:28.965 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:28.965 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:28.965 10:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:28.965 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:28.965 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:28.966 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:29.223 [2024-07-12 10:55:04.258830] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:29.223 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 343d1f35-3876-487c-829c-762cc5f1b7c2 '!=' 343d1f35-3876-487c-829c-762cc5f1b7c2 ']' 00:28:29.223 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:29.223 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:29.223 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:29.223 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:29.480 [2024-07-12 10:55:04.503220] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:29.480 10:55:04 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.480 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.737 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:29.737 "name": "raid_bdev1", 00:28:29.737 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:29.737 "strip_size_kb": 0, 00:28:29.737 "state": "online", 00:28:29.737 "raid_level": "raid1", 00:28:29.737 "superblock": true, 00:28:29.737 "num_base_bdevs": 2, 00:28:29.737 "num_base_bdevs_discovered": 1, 00:28:29.737 "num_base_bdevs_operational": 1, 00:28:29.737 "base_bdevs_list": [ 00:28:29.737 { 00:28:29.737 "name": null, 00:28:29.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:29.737 "is_configured": false, 00:28:29.737 "data_offset": 256, 00:28:29.737 "data_size": 7936 00:28:29.737 }, 00:28:29.737 { 00:28:29.737 "name": "pt2", 00:28:29.737 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:29.737 "is_configured": true, 00:28:29.737 "data_offset": 256, 00:28:29.737 "data_size": 7936 00:28:29.737 } 00:28:29.737 ] 00:28:29.737 }' 00:28:29.737 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:29.737 10:55:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:30.301 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:30.558 [2024-07-12 10:55:05.517888] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:30.558 [2024-07-12 10:55:05.517915] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:30.558 [2024-07-12 10:55:05.517972] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:30.558 [2024-07-12 10:55:05.518018] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:30.558 [2024-07-12 10:55:05.518030] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f77c10 name raid_bdev1, state offline 00:28:30.558 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.558 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:30.816 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:30.816 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:30.816 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:30.816 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
00:28:30.816 10:55:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:31.073 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:31.073 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:31.074 [2024-07-12 10:55:06.243775] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:31.074 [2024-07-12 10:55:06.243826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:31.074 [2024-07-12 10:55:06.243846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fc9f0 00:28:31.074 [2024-07-12 10:55:06.243858] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:31.074 [2024-07-12 10:55:06.245339] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:31.074 [2024-07-12 10:55:06.245369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:31.074 [2024-07-12 10:55:06.245417] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:31.074 [2024-07-12 10:55:06.245444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:31.074 [2024-07-12 10:55:06.245523] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20fdea0 00:28:31.074 [2024-07-12 10:55:06.245533] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:31.074 [2024-07-12 10:55:06.245598] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20fbbc0 00:28:31.074 [2024-07-12 10:55:06.245669] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20fdea0 00:28:31.074 [2024-07-12 10:55:06.245685] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20fdea0 00:28:31.074 [2024-07-12 10:55:06.245741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:31.074 pt2 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.074 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.331 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.331 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.331 "name": "raid_bdev1", 00:28:31.331 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:31.331 "strip_size_kb": 0, 00:28:31.331 "state": "online", 00:28:31.331 "raid_level": "raid1", 00:28:31.331 "superblock": true, 00:28:31.331 "num_base_bdevs": 2, 00:28:31.331 "num_base_bdevs_discovered": 1, 00:28:31.331 "num_base_bdevs_operational": 1, 00:28:31.331 "base_bdevs_list": [ 00:28:31.331 { 00:28:31.331 "name": null, 00:28:31.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.331 "is_configured": false, 00:28:31.331 "data_offset": 256, 00:28:31.331 "data_size": 7936 00:28:31.331 }, 00:28:31.331 { 00:28:31.331 "name": "pt2", 00:28:31.331 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:31.331 "is_configured": true, 00:28:31.331 "data_offset": 256, 00:28:31.331 "data_size": 7936 00:28:31.331 } 00:28:31.331 ] 00:28:31.331 }' 00:28:31.331 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.331 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:31.896 10:55:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:32.153 [2024-07-12 10:55:07.214339] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:32.154 [2024-07-12 10:55:07.214366] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:32.154 [2024-07-12 10:55:07.214420] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:32.154 [2024-07-12 10:55:07.214461] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:32.154 [2024-07-12 10:55:07.214472] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20fdea0 name raid_bdev1, state offline 00:28:32.154 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.154 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:32.412 [2024-07-12 10:55:07.543201] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:32.412 [2024-07-12 10:55:07.543245] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:32.412 [2024-07-12 10:55:07.543264] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fc620 00:28:32.412 [2024-07-12 10:55:07.543277] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:32.412 [2024-07-12 10:55:07.544713] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:32.412 [2024-07-12 10:55:07.544743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:32.412 [2024-07-12 10:55:07.544794] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:32.412 [2024-07-12 10:55:07.544818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:32.412 [2024-07-12 10:55:07.544897] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:32.412 [2024-07-12 10:55:07.544910] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:32.412 [2024-07-12 10:55:07.544925] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20fe640 name raid_bdev1, state configuring 00:28:32.412 [2024-07-12 10:55:07.544948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:32.412 [2024-07-12 10:55:07.544997] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20fe640 00:28:32.412 [2024-07-12 10:55:07.545008] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:32.412 [2024-07-12 10:55:07.545067] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20fd810 00:28:32.412 [2024-07-12 10:55:07.545138] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20fe640 00:28:32.412 [2024-07-12 10:55:07.545147] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20fe640 00:28:32.412 [2024-07-12 10:55:07.545205] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:32.412 pt1 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:32.412 10:55:07 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.412 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:32.669 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:32.670 "name": "raid_bdev1", 00:28:32.670 "uuid": "343d1f35-3876-487c-829c-762cc5f1b7c2", 00:28:32.670 "strip_size_kb": 0, 00:28:32.670 "state": "online", 00:28:32.670 "raid_level": "raid1", 00:28:32.670 "superblock": true, 00:28:32.670 "num_base_bdevs": 2, 00:28:32.670 "num_base_bdevs_discovered": 1, 00:28:32.670 "num_base_bdevs_operational": 1, 00:28:32.670 "base_bdevs_list": [ 00:28:32.670 { 00:28:32.670 "name": null, 00:28:32.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:32.670 "is_configured": false, 00:28:32.670 "data_offset": 256, 00:28:32.670 "data_size": 7936 00:28:32.670 }, 00:28:32.670 { 00:28:32.670 "name": "pt2", 00:28:32.670 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:32.670 "is_configured": true, 00:28:32.670 "data_offset": 256, 00:28:32.670 "data_size": 7936 00:28:32.670 } 00:28:32.670 ] 00:28:32.670 }' 00:28:32.670 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:32.670 10:55:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:33.301 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:33.301 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:33.558 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:33.558 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:33.558 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:33.814 [2024-07-12 10:55:08.778708] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 343d1f35-3876-487c-829c-762cc5f1b7c2 '!=' 343d1f35-3876-487c-829c-762cc5f1b7c2 ']' 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2172036 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2172036 ']' 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2172036 00:28:33.814 10:55:08 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2172036 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2172036' 00:28:33.814 killing process with pid 2172036 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 2172036 00:28:33.814 [2024-07-12 10:55:08.834360] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:33.814 10:55:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 2172036 00:28:33.814 [2024-07-12 10:55:08.834416] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:33.814 [2024-07-12 10:55:08.834461] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:33.814 [2024-07-12 10:55:08.834473] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20fe640 name raid_bdev1, state offline 00:28:33.815 [2024-07-12 10:55:08.851061] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:34.072 10:55:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:28:34.072 00:28:34.072 real 0m15.311s 00:28:34.072 user 0m27.806s 00:28:34.072 sys 0m2.812s 00:28:34.072 10:55:09 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:34.072 10:55:09 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:34.072 ************************************ 00:28:34.072 END TEST raid_superblock_test_md_interleaved 00:28:34.072 ************************************ 00:28:34.072 10:55:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:34.072 10:55:09 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:28:34.072 10:55:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:34.072 10:55:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:34.072 10:55:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:34.072 ************************************ 00:28:34.072 START TEST raid_rebuild_test_sb_md_interleaved 00:28:34.072 ************************************ 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:34.072 
10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2174286 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2174286 /var/tmp/spdk-raid.sock 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2174286 ']' 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
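For orientation while reading the trace: bdevperf has just been launched with -r /var/tmp/spdk-raid.sock, and everything that follows is driven over that RPC socket. The lines below are only a condensed sketch of the same base-bdev setup, assuming an SPDK application is already listening on that socket; the $RPC shorthand is introduced here for brevity, and the individual invocations reuse exactly the bdev_malloc_create, bdev_passthru_create, bdev_raid_create and bdev_raid_get_bdevs calls recorded further down in this trace.

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# malloc base devices with interleaved metadata, same flags as the trace (32 MB, 4096-byte blocks, -m 32 -i)
$RPC bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc
$RPC bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc
# passthru bdevs layered on top, as the test does before handing the devices to the raid module
$RPC bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
$RPC bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
# raid1 with an on-disk superblock (-s), then inspect it the same way verify_raid_bdev_state does
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The blocklen of 4128 reported later in the trace (4096 data bytes plus the 32 interleaved metadata bytes from -m 32 -i) is what makes the md_interleaved variant of this test meaningful.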
00:28:34.072 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:34.072 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:34.072 [2024-07-12 10:55:09.176161] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:28:34.072 [2024-07-12 10:55:09.176205] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2174286 ] 00:28:34.072 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:34.072 Zero copy mechanism will not be used. 00:28:34.330 [2024-07-12 10:55:09.286861] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:34.330 [2024-07-12 10:55:09.394654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:34.330 [2024-07-12 10:55:09.449003] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:34.330 [2024-07-12 10:55:09.449033] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:34.587 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:34.587 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:34.587 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:34.587 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:28:34.845 BaseBdev1_malloc 00:28:34.845 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:34.845 [2024-07-12 10:55:09.974409] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:34.845 [2024-07-12 10:55:09.974456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:34.845 [2024-07-12 10:55:09.974490] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1112ce0 00:28:34.845 [2024-07-12 10:55:09.974504] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:34.845 [2024-07-12 10:55:09.976052] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:34.845 [2024-07-12 10:55:09.976081] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:34.845 BaseBdev1 00:28:34.845 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:34.845 10:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:28:35.102 BaseBdev2_malloc 00:28:35.102 10:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:35.360 [2024-07-12 10:55:10.385975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:35.360 [2024-07-12 10:55:10.386022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:35.360 [2024-07-12 10:55:10.386045] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110a2d0 00:28:35.360 [2024-07-12 10:55:10.386057] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:35.360 [2024-07-12 10:55:10.387826] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:35.360 [2024-07-12 10:55:10.387854] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:35.360 BaseBdev2 00:28:35.360 10:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:28:35.618 spare_malloc 00:28:35.618 10:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:35.618 spare_delay 00:28:35.618 10:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:35.875 [2024-07-12 10:55:10.953590] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:35.876 [2024-07-12 10:55:10.953632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:35.876 [2024-07-12 10:55:10.953654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110d070 00:28:35.876 [2024-07-12 10:55:10.953666] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:35.876 [2024-07-12 10:55:10.955041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:35.876 [2024-07-12 10:55:10.955067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:35.876 spare 00:28:35.876 10:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:36.133 [2024-07-12 10:55:11.114039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:36.133 [2024-07-12 10:55:11.115316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:36.133 [2024-07-12 10:55:11.115489] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x110f370 00:28:36.133 [2024-07-12 10:55:11.115503] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:36.133 [2024-07-12 10:55:11.115573] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf759c0 00:28:36.133 [2024-07-12 10:55:11.115657] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x110f370 00:28:36.133 [2024-07-12 10:55:11.115667] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x110f370 00:28:36.133 [2024-07-12 10:55:11.115722] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.133 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.699 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:36.699 "name": "raid_bdev1", 00:28:36.699 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:36.699 "strip_size_kb": 0, 00:28:36.699 "state": "online", 00:28:36.699 "raid_level": "raid1", 00:28:36.699 "superblock": true, 00:28:36.699 "num_base_bdevs": 2, 00:28:36.699 "num_base_bdevs_discovered": 2, 00:28:36.699 "num_base_bdevs_operational": 2, 00:28:36.699 "base_bdevs_list": [ 00:28:36.699 { 00:28:36.699 "name": "BaseBdev1", 00:28:36.699 "uuid": "9508523c-c06a-55f8-a167-5b9d1e5c6f4e", 00:28:36.699 "is_configured": true, 00:28:36.699 "data_offset": 256, 00:28:36.699 "data_size": 7936 00:28:36.699 }, 00:28:36.699 { 00:28:36.699 "name": "BaseBdev2", 00:28:36.699 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:36.699 "is_configured": true, 00:28:36.699 "data_offset": 256, 00:28:36.699 "data_size": 7936 00:28:36.699 } 00:28:36.699 ] 00:28:36.699 }' 00:28:36.699 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:36.699 10:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:37.263 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:37.263 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:37.521 [2024-07-12 10:55:12.461857] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:37.521 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:37.521 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.521 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:37.779 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:37.780 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:37.780 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:28:37.780 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:37.780 [2024-07-12 10:55:12.954898] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:37.780 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:37.780 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.041 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.042 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.042 10:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.042 10:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.042 "name": "raid_bdev1", 00:28:38.042 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:38.042 "strip_size_kb": 0, 00:28:38.042 "state": "online", 00:28:38.042 "raid_level": "raid1", 00:28:38.042 "superblock": true, 00:28:38.042 "num_base_bdevs": 2, 00:28:38.042 "num_base_bdevs_discovered": 1, 00:28:38.042 "num_base_bdevs_operational": 1, 00:28:38.042 "base_bdevs_list": [ 00:28:38.042 { 00:28:38.042 "name": null, 00:28:38.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:38.042 "is_configured": false, 00:28:38.042 "data_offset": 256, 00:28:38.042 "data_size": 7936 00:28:38.042 }, 00:28:38.042 { 00:28:38.042 "name": "BaseBdev2", 00:28:38.042 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:38.042 "is_configured": true, 00:28:38.042 "data_offset": 256, 00:28:38.042 "data_size": 7936 00:28:38.042 } 00:28:38.042 ] 00:28:38.042 }' 00:28:38.042 10:55:13 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.042 10:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:38.607 10:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:38.865 [2024-07-12 10:55:13.909440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:38.865 [2024-07-12 10:55:13.913050] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110f250 00:28:38.865 [2024-07-12 10:55:13.915057] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:38.865 10:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:39.799 10:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:39.799 10:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:39.799 10:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:39.799 10:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:39.799 10:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:39.799 10:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:39.800 10:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.057 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:40.057 "name": "raid_bdev1", 00:28:40.057 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:40.057 "strip_size_kb": 0, 00:28:40.057 "state": "online", 00:28:40.057 "raid_level": "raid1", 00:28:40.057 "superblock": true, 00:28:40.057 "num_base_bdevs": 2, 00:28:40.057 "num_base_bdevs_discovered": 2, 00:28:40.057 "num_base_bdevs_operational": 2, 00:28:40.057 "process": { 00:28:40.057 "type": "rebuild", 00:28:40.057 "target": "spare", 00:28:40.057 "progress": { 00:28:40.057 "blocks": 3072, 00:28:40.057 "percent": 38 00:28:40.057 } 00:28:40.057 }, 00:28:40.057 "base_bdevs_list": [ 00:28:40.057 { 00:28:40.057 "name": "spare", 00:28:40.057 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:40.057 "is_configured": true, 00:28:40.057 "data_offset": 256, 00:28:40.057 "data_size": 7936 00:28:40.057 }, 00:28:40.057 { 00:28:40.057 "name": "BaseBdev2", 00:28:40.057 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:40.057 "is_configured": true, 00:28:40.057 "data_offset": 256, 00:28:40.057 "data_size": 7936 00:28:40.057 } 00:28:40.057 ] 00:28:40.057 }' 00:28:40.057 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:40.057 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:40.057 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:40.315 10:55:15 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:40.315 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:40.315 [2024-07-12 10:55:15.479868] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:40.573 [2024-07-12 10:55:15.527703] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:40.573 [2024-07-12 10:55:15.527748] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:40.573 [2024-07-12 10:55:15.527763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:40.573 [2024-07-12 10:55:15.527777] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.573 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.831 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:40.831 "name": "raid_bdev1", 00:28:40.831 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:40.831 "strip_size_kb": 0, 00:28:40.831 "state": "online", 00:28:40.831 "raid_level": "raid1", 00:28:40.831 "superblock": true, 00:28:40.831 "num_base_bdevs": 2, 00:28:40.831 "num_base_bdevs_discovered": 1, 00:28:40.831 "num_base_bdevs_operational": 1, 00:28:40.831 "base_bdevs_list": [ 00:28:40.831 { 00:28:40.831 "name": null, 00:28:40.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:40.831 "is_configured": false, 00:28:40.831 "data_offset": 256, 00:28:40.831 "data_size": 7936 00:28:40.831 }, 00:28:40.831 { 00:28:40.831 "name": "BaseBdev2", 00:28:40.831 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:40.831 "is_configured": true, 00:28:40.831 "data_offset": 256, 00:28:40.831 "data_size": 7936 00:28:40.831 } 00:28:40.831 ] 00:28:40.831 }' 00:28:40.831 
10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:40.831 10:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:41.397 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:41.397 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:41.397 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:41.397 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:41.397 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:41.397 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.397 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.654 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:41.654 "name": "raid_bdev1", 00:28:41.654 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:41.654 "strip_size_kb": 0, 00:28:41.654 "state": "online", 00:28:41.654 "raid_level": "raid1", 00:28:41.654 "superblock": true, 00:28:41.654 "num_base_bdevs": 2, 00:28:41.654 "num_base_bdevs_discovered": 1, 00:28:41.654 "num_base_bdevs_operational": 1, 00:28:41.654 "base_bdevs_list": [ 00:28:41.654 { 00:28:41.654 "name": null, 00:28:41.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.654 "is_configured": false, 00:28:41.654 "data_offset": 256, 00:28:41.654 "data_size": 7936 00:28:41.654 }, 00:28:41.654 { 00:28:41.654 "name": "BaseBdev2", 00:28:41.654 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:41.654 "is_configured": true, 00:28:41.654 "data_offset": 256, 00:28:41.654 "data_size": 7936 00:28:41.654 } 00:28:41.654 ] 00:28:41.654 }' 00:28:41.654 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:41.654 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:41.654 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:41.654 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:41.654 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:41.912 [2024-07-12 10:55:16.967353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:41.912 [2024-07-12 10:55:16.971456] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110b270 00:28:41.912 [2024-07-12 10:55:16.972948] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:41.912 10:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:42.859 10:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
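As a sketch of what verify_raid_bdev_process is doing in the lines that follow: once a base bdev has been re-added with bdev_raid_add_base_bdev and a rebuild has started, the progress can be polled with the same jq filters the script uses. This assumes the SPDK app is still listening on /var/tmp/spdk-raid.sock and that raid_bdev1 exists; the loop itself is illustrative, not part of bdev_raid.sh.

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
while :; do
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # the "process" object is only present while a rebuild is in flight
    [ "$(jq -r '.process.type // "none"' <<< "$info")" = "rebuild" ] || break
    echo "target=$(jq -r '.process.target // "none"' <<< "$info")" \
         "progress=$(jq -r '.process.progress.percent' <<< "$info")%"
    sleep 1
done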
00:28:42.859 10:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:42.859 10:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:42.859 10:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:42.859 10:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:42.859 10:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.859 10:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.115 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:43.115 "name": "raid_bdev1", 00:28:43.115 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:43.115 "strip_size_kb": 0, 00:28:43.115 "state": "online", 00:28:43.115 "raid_level": "raid1", 00:28:43.115 "superblock": true, 00:28:43.115 "num_base_bdevs": 2, 00:28:43.115 "num_base_bdevs_discovered": 2, 00:28:43.115 "num_base_bdevs_operational": 2, 00:28:43.115 "process": { 00:28:43.115 "type": "rebuild", 00:28:43.115 "target": "spare", 00:28:43.115 "progress": { 00:28:43.115 "blocks": 3072, 00:28:43.115 "percent": 38 00:28:43.115 } 00:28:43.115 }, 00:28:43.115 "base_bdevs_list": [ 00:28:43.115 { 00:28:43.115 "name": "spare", 00:28:43.115 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:43.115 "is_configured": true, 00:28:43.115 "data_offset": 256, 00:28:43.115 "data_size": 7936 00:28:43.116 }, 00:28:43.116 { 00:28:43.116 "name": "BaseBdev2", 00:28:43.116 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:43.116 "is_configured": true, 00:28:43.116 "data_offset": 256, 00:28:43.116 "data_size": 7936 00:28:43.116 } 00:28:43.116 ] 00:28:43.116 }' 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:43.116 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1114 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:43.116 10:55:18 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.116 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.678 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:43.678 "name": "raid_bdev1", 00:28:43.678 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:43.678 "strip_size_kb": 0, 00:28:43.678 "state": "online", 00:28:43.678 "raid_level": "raid1", 00:28:43.678 "superblock": true, 00:28:43.678 "num_base_bdevs": 2, 00:28:43.678 "num_base_bdevs_discovered": 2, 00:28:43.678 "num_base_bdevs_operational": 2, 00:28:43.678 "process": { 00:28:43.678 "type": "rebuild", 00:28:43.678 "target": "spare", 00:28:43.678 "progress": { 00:28:43.678 "blocks": 4352, 00:28:43.678 "percent": 54 00:28:43.678 } 00:28:43.678 }, 00:28:43.678 "base_bdevs_list": [ 00:28:43.678 { 00:28:43.678 "name": "spare", 00:28:43.678 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:43.678 "is_configured": true, 00:28:43.678 "data_offset": 256, 00:28:43.678 "data_size": 7936 00:28:43.678 }, 00:28:43.678 { 00:28:43.678 "name": "BaseBdev2", 00:28:43.678 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:43.678 "is_configured": true, 00:28:43.678 "data_offset": 256, 00:28:43.678 "data_size": 7936 00:28:43.678 } 00:28:43.678 ] 00:28:43.678 }' 00:28:43.678 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:43.678 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:43.678 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:43.933 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:43.933 10:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:44.860 10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:44.860 10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:44.860 10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:44.860 10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:44.860 10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:44.860 10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:44.860 
10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.860 10:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.116 [2024-07-12 10:55:20.097144] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:45.116 [2024-07-12 10:55:20.097205] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:45.116 [2024-07-12 10:55:20.097299] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:45.116 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:45.116 "name": "raid_bdev1", 00:28:45.116 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:45.116 "strip_size_kb": 0, 00:28:45.116 "state": "online", 00:28:45.116 "raid_level": "raid1", 00:28:45.116 "superblock": true, 00:28:45.116 "num_base_bdevs": 2, 00:28:45.116 "num_base_bdevs_discovered": 2, 00:28:45.116 "num_base_bdevs_operational": 2, 00:28:45.116 "base_bdevs_list": [ 00:28:45.116 { 00:28:45.116 "name": "spare", 00:28:45.116 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:45.116 "is_configured": true, 00:28:45.116 "data_offset": 256, 00:28:45.116 "data_size": 7936 00:28:45.116 }, 00:28:45.116 { 00:28:45.116 "name": "BaseBdev2", 00:28:45.116 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:45.116 "is_configured": true, 00:28:45.116 "data_offset": 256, 00:28:45.116 "data_size": 7936 00:28:45.116 } 00:28:45.116 ] 00:28:45.116 }' 00:28:45.116 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:45.116 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:45.116 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:45.116 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:45.116 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:28:45.116 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:45.117 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:45.117 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:45.117 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:45.117 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:45.117 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.117 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.373 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:45.373 "name": "raid_bdev1", 00:28:45.373 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:45.373 "strip_size_kb": 0, 
00:28:45.373 "state": "online", 00:28:45.373 "raid_level": "raid1", 00:28:45.373 "superblock": true, 00:28:45.373 "num_base_bdevs": 2, 00:28:45.373 "num_base_bdevs_discovered": 2, 00:28:45.373 "num_base_bdevs_operational": 2, 00:28:45.373 "base_bdevs_list": [ 00:28:45.373 { 00:28:45.373 "name": "spare", 00:28:45.373 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:45.373 "is_configured": true, 00:28:45.373 "data_offset": 256, 00:28:45.373 "data_size": 7936 00:28:45.373 }, 00:28:45.373 { 00:28:45.373 "name": "BaseBdev2", 00:28:45.373 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:45.373 "is_configured": true, 00:28:45.373 "data_offset": 256, 00:28:45.373 "data_size": 7936 00:28:45.373 } 00:28:45.373 ] 00:28:45.373 }' 00:28:45.373 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:45.373 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:45.373 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.630 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.886 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:45.886 "name": "raid_bdev1", 00:28:45.886 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:45.886 "strip_size_kb": 0, 00:28:45.886 "state": "online", 00:28:45.886 "raid_level": "raid1", 00:28:45.886 "superblock": true, 00:28:45.886 "num_base_bdevs": 2, 00:28:45.886 "num_base_bdevs_discovered": 2, 00:28:45.886 "num_base_bdevs_operational": 2, 00:28:45.886 "base_bdevs_list": [ 00:28:45.886 { 00:28:45.886 "name": "spare", 00:28:45.886 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:45.886 "is_configured": true, 00:28:45.886 "data_offset": 256, 00:28:45.886 "data_size": 
7936 00:28:45.886 }, 00:28:45.886 { 00:28:45.886 "name": "BaseBdev2", 00:28:45.886 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:45.886 "is_configured": true, 00:28:45.886 "data_offset": 256, 00:28:45.886 "data_size": 7936 00:28:45.886 } 00:28:45.886 ] 00:28:45.886 }' 00:28:45.886 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:45.886 10:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:46.449 10:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:46.706 [2024-07-12 10:55:21.645597] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:46.706 [2024-07-12 10:55:21.645633] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:46.706 [2024-07-12 10:55:21.645687] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:46.706 [2024-07-12 10:55:21.645742] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:46.706 [2024-07-12 10:55:21.645754] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x110f370 name raid_bdev1, state offline 00:28:46.706 10:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.706 10:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:28:46.963 10:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:46.963 10:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:28:46.963 10:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:46.963 10:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:47.220 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:47.220 [2024-07-12 10:55:22.387525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:47.220 [2024-07-12 10:55:22.387568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:47.220 [2024-07-12 10:55:22.387588] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1111730 00:28:47.220 [2024-07-12 10:55:22.387601] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:47.220 [2024-07-12 10:55:22.389041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:47.220 [2024-07-12 10:55:22.389069] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:47.220 [2024-07-12 10:55:22.389121] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:47.220 [2024-07-12 10:55:22.389145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:47.220 [2024-07-12 10:55:22.389236] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:47.220 spare 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.478 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.478 [2024-07-12 10:55:22.489542] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x110f810 00:28:47.478 [2024-07-12 10:55:22.489560] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:47.478 [2024-07-12 10:55:22.489637] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110aa60 00:28:47.478 [2024-07-12 10:55:22.489730] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x110f810 00:28:47.478 [2024-07-12 10:55:22.489740] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x110f810 00:28:47.478 [2024-07-12 10:55:22.489804] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:47.735 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.735 "name": "raid_bdev1", 00:28:47.735 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:47.735 "strip_size_kb": 0, 00:28:47.735 "state": "online", 00:28:47.735 "raid_level": "raid1", 00:28:47.735 "superblock": true, 00:28:47.735 "num_base_bdevs": 2, 00:28:47.735 "num_base_bdevs_discovered": 2, 00:28:47.735 "num_base_bdevs_operational": 2, 00:28:47.735 "base_bdevs_list": [ 00:28:47.735 { 00:28:47.735 "name": "spare", 00:28:47.735 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:47.735 "is_configured": true, 00:28:47.735 "data_offset": 256, 00:28:47.735 "data_size": 7936 00:28:47.735 }, 00:28:47.735 { 00:28:47.735 "name": "BaseBdev2", 00:28:47.735 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:47.735 "is_configured": true, 00:28:47.735 "data_offset": 256, 00:28:47.735 "data_size": 7936 00:28:47.735 } 00:28:47.735 ] 00:28:47.735 }' 00:28:47.735 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:28:47.735 10:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:48.302 "name": "raid_bdev1", 00:28:48.302 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:48.302 "strip_size_kb": 0, 00:28:48.302 "state": "online", 00:28:48.302 "raid_level": "raid1", 00:28:48.302 "superblock": true, 00:28:48.302 "num_base_bdevs": 2, 00:28:48.302 "num_base_bdevs_discovered": 2, 00:28:48.302 "num_base_bdevs_operational": 2, 00:28:48.302 "base_bdevs_list": [ 00:28:48.302 { 00:28:48.302 "name": "spare", 00:28:48.302 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:48.302 "is_configured": true, 00:28:48.302 "data_offset": 256, 00:28:48.302 "data_size": 7936 00:28:48.302 }, 00:28:48.302 { 00:28:48.302 "name": "BaseBdev2", 00:28:48.302 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:48.302 "is_configured": true, 00:28:48.302 "data_offset": 256, 00:28:48.302 "data_size": 7936 00:28:48.302 } 00:28:48.302 ] 00:28:48.302 }' 00:28:48.302 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:48.559 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:48.559 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:48.559 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:48.559 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.559 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:48.816 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:48.816 10:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:49.072 [2024-07-12 10:55:24.052019] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:49.072 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:49.072 
10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:49.072 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:49.072 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:49.072 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:49.072 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:49.072 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:49.072 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:49.073 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:49.073 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:49.073 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.073 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:49.330 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:49.330 "name": "raid_bdev1", 00:28:49.330 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:49.330 "strip_size_kb": 0, 00:28:49.330 "state": "online", 00:28:49.330 "raid_level": "raid1", 00:28:49.330 "superblock": true, 00:28:49.330 "num_base_bdevs": 2, 00:28:49.330 "num_base_bdevs_discovered": 1, 00:28:49.330 "num_base_bdevs_operational": 1, 00:28:49.330 "base_bdevs_list": [ 00:28:49.330 { 00:28:49.330 "name": null, 00:28:49.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:49.330 "is_configured": false, 00:28:49.330 "data_offset": 256, 00:28:49.330 "data_size": 7936 00:28:49.330 }, 00:28:49.330 { 00:28:49.330 "name": "BaseBdev2", 00:28:49.330 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:49.330 "is_configured": true, 00:28:49.330 "data_offset": 256, 00:28:49.330 "data_size": 7936 00:28:49.330 } 00:28:49.330 ] 00:28:49.330 }' 00:28:49.330 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:49.330 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:49.906 10:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:50.196 [2024-07-12 10:55:25.134898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:50.196 [2024-07-12 10:55:25.135042] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:50.196 [2024-07-12 10:55:25.135059] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
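The trace above is driving the bdev_raid.sh verify helpers (verify_raid_bdev_process / verify_raid_bdev_state): raid state is read back over the RPC socket with bdev_raid_get_bdevs and filtered with jq, and removing then re-adding the 'spare' base bdev is what kicks off the rebuild the later checks wait for. A minimal standalone sketch of that same cycle, using only commands that appear in the trace (socket path, rpc.py location and bdev names are taken from the log; this is a sketch, not part of the test suite itself):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Fetch the raid bdev entry the assertions are made against.
raid_info() {
    "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
}

# Drop the 'spare' base bdev: the array stays online but degraded, so
# num_base_bdevs_discovered and num_base_bdevs_operational fall to 1.
"$rpc" -s "$sock" bdev_raid_remove_base_bdev spare
raid_info | jq -r '.state, .num_base_bdevs_discovered, .num_base_bdevs_operational'

# Re-add it; as the trace shows, the superblock seq_number on spare (4) is
# older than the raid bdev's (5), so SPDK re-adds the bdev and starts a
# rebuild, visible as process.type == "rebuild" with process.target == "spare".
"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare
sleep 1
raid_info | jq -r '.process.type // "none", .process.target // "none"'
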
00:28:50.196 [2024-07-12 10:55:25.135086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:50.196 [2024-07-12 10:55:25.138554] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110fd90 00:28:50.196 [2024-07-12 10:55:25.139961] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:50.196 10:55:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:51.128 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:51.128 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:51.128 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:51.128 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:51.128 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:51.128 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.128 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.386 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:51.386 "name": "raid_bdev1", 00:28:51.386 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:51.386 "strip_size_kb": 0, 00:28:51.386 "state": "online", 00:28:51.386 "raid_level": "raid1", 00:28:51.386 "superblock": true, 00:28:51.386 "num_base_bdevs": 2, 00:28:51.386 "num_base_bdevs_discovered": 2, 00:28:51.386 "num_base_bdevs_operational": 2, 00:28:51.386 "process": { 00:28:51.386 "type": "rebuild", 00:28:51.386 "target": "spare", 00:28:51.386 "progress": { 00:28:51.386 "blocks": 3072, 00:28:51.386 "percent": 38 00:28:51.386 } 00:28:51.386 }, 00:28:51.386 "base_bdevs_list": [ 00:28:51.386 { 00:28:51.386 "name": "spare", 00:28:51.386 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:51.386 "is_configured": true, 00:28:51.386 "data_offset": 256, 00:28:51.386 "data_size": 7936 00:28:51.386 }, 00:28:51.386 { 00:28:51.386 "name": "BaseBdev2", 00:28:51.386 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:51.386 "is_configured": true, 00:28:51.386 "data_offset": 256, 00:28:51.386 "data_size": 7936 00:28:51.386 } 00:28:51.386 ] 00:28:51.386 }' 00:28:51.386 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:51.386 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:51.386 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:51.386 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:51.386 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:51.644 [2024-07-12 10:55:26.729216] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:51.644 [2024-07-12 10:55:26.752654] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:51.644 [2024-07-12 10:55:26.752698] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:51.644 [2024-07-12 10:55:26.752719] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:51.644 [2024-07-12 10:55:26.752727] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.644 10:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.901 10:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:51.901 "name": "raid_bdev1", 00:28:51.901 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:51.901 "strip_size_kb": 0, 00:28:51.901 "state": "online", 00:28:51.901 "raid_level": "raid1", 00:28:51.901 "superblock": true, 00:28:51.901 "num_base_bdevs": 2, 00:28:51.902 "num_base_bdevs_discovered": 1, 00:28:51.902 "num_base_bdevs_operational": 1, 00:28:51.902 "base_bdevs_list": [ 00:28:51.902 { 00:28:51.902 "name": null, 00:28:51.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:51.902 "is_configured": false, 00:28:51.902 "data_offset": 256, 00:28:51.902 "data_size": 7936 00:28:51.902 }, 00:28:51.902 { 00:28:51.902 "name": "BaseBdev2", 00:28:51.902 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:51.902 "is_configured": true, 00:28:51.902 "data_offset": 256, 00:28:51.902 "data_size": 7936 00:28:51.902 } 00:28:51.902 ] 00:28:51.902 }' 00:28:51.902 10:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:51.902 10:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:52.466 10:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:52.724 [2024-07-12 
10:55:27.791177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:52.724 [2024-07-12 10:55:27.791225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:52.724 [2024-07-12 10:55:27.791246] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110ec80 00:28:52.724 [2024-07-12 10:55:27.791259] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:52.724 [2024-07-12 10:55:27.791444] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:52.724 [2024-07-12 10:55:27.791459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:52.724 [2024-07-12 10:55:27.791523] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:52.724 [2024-07-12 10:55:27.791535] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:52.724 [2024-07-12 10:55:27.791545] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:52.724 [2024-07-12 10:55:27.791563] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:52.724 [2024-07-12 10:55:27.795041] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110f0a0 00:28:52.724 [2024-07-12 10:55:27.796370] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:52.724 spare 00:28:52.724 10:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:53.657 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:53.657 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:53.657 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:53.657 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:53.657 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:53.657 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.657 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.914 10:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:53.914 "name": "raid_bdev1", 00:28:53.914 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:53.914 "strip_size_kb": 0, 00:28:53.914 "state": "online", 00:28:53.914 "raid_level": "raid1", 00:28:53.914 "superblock": true, 00:28:53.914 "num_base_bdevs": 2, 00:28:53.914 "num_base_bdevs_discovered": 2, 00:28:53.914 "num_base_bdevs_operational": 2, 00:28:53.914 "process": { 00:28:53.914 "type": "rebuild", 00:28:53.914 "target": "spare", 00:28:53.914 "progress": { 00:28:53.914 "blocks": 2816, 00:28:53.914 "percent": 35 00:28:53.914 } 00:28:53.914 }, 00:28:53.914 "base_bdevs_list": [ 00:28:53.914 { 00:28:53.914 "name": "spare", 00:28:53.914 "uuid": "b32dced3-57ba-5035-8472-b4efe59f5d17", 00:28:53.914 "is_configured": true, 00:28:53.914 "data_offset": 256, 00:28:53.914 
"data_size": 7936 00:28:53.914 }, 00:28:53.914 { 00:28:53.914 "name": "BaseBdev2", 00:28:53.914 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:53.914 "is_configured": true, 00:28:53.914 "data_offset": 256, 00:28:53.914 "data_size": 7936 00:28:53.914 } 00:28:53.914 ] 00:28:53.914 }' 00:28:53.914 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:53.914 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:53.914 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:53.914 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:53.914 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:54.172 [2024-07-12 10:55:29.314004] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:54.430 [2024-07-12 10:55:29.408917] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:54.430 [2024-07-12 10:55:29.408961] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:54.430 [2024-07-12 10:55:29.408977] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:54.430 [2024-07-12 10:55:29.408985] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.430 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.687 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:54.687 "name": "raid_bdev1", 00:28:54.687 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:54.687 "strip_size_kb": 0, 00:28:54.687 "state": "online", 00:28:54.687 
"raid_level": "raid1", 00:28:54.687 "superblock": true, 00:28:54.687 "num_base_bdevs": 2, 00:28:54.687 "num_base_bdevs_discovered": 1, 00:28:54.688 "num_base_bdevs_operational": 1, 00:28:54.688 "base_bdevs_list": [ 00:28:54.688 { 00:28:54.688 "name": null, 00:28:54.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:54.688 "is_configured": false, 00:28:54.688 "data_offset": 256, 00:28:54.688 "data_size": 7936 00:28:54.688 }, 00:28:54.688 { 00:28:54.688 "name": "BaseBdev2", 00:28:54.688 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:54.688 "is_configured": true, 00:28:54.688 "data_offset": 256, 00:28:54.688 "data_size": 7936 00:28:54.688 } 00:28:54.688 ] 00:28:54.688 }' 00:28:54.688 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:54.688 10:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:55.253 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:55.253 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:55.253 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:55.253 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:55.253 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:55.253 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.253 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:55.511 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:55.511 "name": "raid_bdev1", 00:28:55.511 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:55.511 "strip_size_kb": 0, 00:28:55.511 "state": "online", 00:28:55.511 "raid_level": "raid1", 00:28:55.511 "superblock": true, 00:28:55.511 "num_base_bdevs": 2, 00:28:55.511 "num_base_bdevs_discovered": 1, 00:28:55.511 "num_base_bdevs_operational": 1, 00:28:55.511 "base_bdevs_list": [ 00:28:55.511 { 00:28:55.511 "name": null, 00:28:55.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:55.511 "is_configured": false, 00:28:55.511 "data_offset": 256, 00:28:55.511 "data_size": 7936 00:28:55.511 }, 00:28:55.511 { 00:28:55.511 "name": "BaseBdev2", 00:28:55.511 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:55.511 "is_configured": true, 00:28:55.511 "data_offset": 256, 00:28:55.511 "data_size": 7936 00:28:55.511 } 00:28:55.511 ] 00:28:55.511 }' 00:28:55.511 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:55.511 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:55.511 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:55.511 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:55.511 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:55.769 10:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:56.027 [2024-07-12 10:55:31.085170] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:56.027 [2024-07-12 10:55:31.085216] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:56.027 [2024-07-12 10:55:31.085237] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110e510 00:28:56.027 [2024-07-12 10:55:31.085250] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:56.027 [2024-07-12 10:55:31.085411] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:56.027 [2024-07-12 10:55:31.085427] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:56.027 [2024-07-12 10:55:31.085470] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:56.027 [2024-07-12 10:55:31.085489] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:56.027 [2024-07-12 10:55:31.085500] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:56.027 BaseBdev1 00:28:56.027 10:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.961 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:57.218 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:57.218 "name": "raid_bdev1", 00:28:57.219 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:57.219 "strip_size_kb": 0, 00:28:57.219 "state": "online", 00:28:57.219 "raid_level": "raid1", 00:28:57.219 
"superblock": true, 00:28:57.219 "num_base_bdevs": 2, 00:28:57.219 "num_base_bdevs_discovered": 1, 00:28:57.219 "num_base_bdevs_operational": 1, 00:28:57.219 "base_bdevs_list": [ 00:28:57.219 { 00:28:57.219 "name": null, 00:28:57.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.219 "is_configured": false, 00:28:57.219 "data_offset": 256, 00:28:57.219 "data_size": 7936 00:28:57.219 }, 00:28:57.219 { 00:28:57.219 "name": "BaseBdev2", 00:28:57.219 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:57.219 "is_configured": true, 00:28:57.219 "data_offset": 256, 00:28:57.219 "data_size": 7936 00:28:57.219 } 00:28:57.219 ] 00:28:57.219 }' 00:28:57.219 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:57.219 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:57.784 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:57.784 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:57.784 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:57.784 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:57.784 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:57.784 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.784 10:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.041 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:58.041 "name": "raid_bdev1", 00:28:58.041 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:58.041 "strip_size_kb": 0, 00:28:58.041 "state": "online", 00:28:58.041 "raid_level": "raid1", 00:28:58.041 "superblock": true, 00:28:58.041 "num_base_bdevs": 2, 00:28:58.041 "num_base_bdevs_discovered": 1, 00:28:58.041 "num_base_bdevs_operational": 1, 00:28:58.041 "base_bdevs_list": [ 00:28:58.041 { 00:28:58.041 "name": null, 00:28:58.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.041 "is_configured": false, 00:28:58.041 "data_offset": 256, 00:28:58.041 "data_size": 7936 00:28:58.041 }, 00:28:58.041 { 00:28:58.041 "name": "BaseBdev2", 00:28:58.041 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:58.041 "is_configured": true, 00:28:58.041 "data_offset": 256, 00:28:58.041 "data_size": 7936 00:28:58.041 } 00:28:58.041 ] 00:28:58.041 }' 00:28:58.041 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:58.299 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:58.556 [2024-07-12 10:55:33.527672] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:58.556 [2024-07-12 10:55:33.527791] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:58.556 [2024-07-12 10:55:33.527806] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:58.556 request: 00:28:58.556 { 00:28:58.556 "base_bdev": "BaseBdev1", 00:28:58.556 "raid_bdev": "raid_bdev1", 00:28:58.556 "method": "bdev_raid_add_base_bdev", 00:28:58.556 "req_id": 1 00:28:58.556 } 00:28:58.556 Got JSON-RPC error response 00:28:58.556 response: 00:28:58.556 { 00:28:58.556 "code": -22, 00:28:58.556 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:58.556 } 00:28:58.556 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:58.556 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:58.556 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:58.556 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:58.556 10:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:59.488 10:55:34 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.488 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:59.745 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:59.745 "name": "raid_bdev1", 00:28:59.745 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:28:59.745 "strip_size_kb": 0, 00:28:59.745 "state": "online", 00:28:59.745 "raid_level": "raid1", 00:28:59.745 "superblock": true, 00:28:59.745 "num_base_bdevs": 2, 00:28:59.745 "num_base_bdevs_discovered": 1, 00:28:59.745 "num_base_bdevs_operational": 1, 00:28:59.745 "base_bdevs_list": [ 00:28:59.745 { 00:28:59.745 "name": null, 00:28:59.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:59.745 "is_configured": false, 00:28:59.745 "data_offset": 256, 00:28:59.745 "data_size": 7936 00:28:59.745 }, 00:28:59.745 { 00:28:59.745 "name": "BaseBdev2", 00:28:59.745 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:28:59.745 "is_configured": true, 00:28:59.745 "data_offset": 256, 00:28:59.745 "data_size": 7936 00:28:59.745 } 00:28:59.745 ] 00:28:59.745 }' 00:28:59.745 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:59.745 10:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:00.311 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:00.311 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:00.311 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:00.311 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:00.311 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:00.311 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.311 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.569 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:00.569 "name": "raid_bdev1", 00:29:00.569 "uuid": "482b13ac-02d8-4c53-8756-91441a70eb6f", 00:29:00.569 "strip_size_kb": 0, 00:29:00.569 "state": "online", 00:29:00.569 "raid_level": "raid1", 00:29:00.569 "superblock": true, 00:29:00.569 "num_base_bdevs": 2, 00:29:00.569 "num_base_bdevs_discovered": 1, 00:29:00.569 "num_base_bdevs_operational": 1, 00:29:00.569 "base_bdevs_list": [ 00:29:00.569 { 00:29:00.569 "name": null, 00:29:00.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:00.570 "is_configured": false, 00:29:00.570 "data_offset": 256, 00:29:00.570 "data_size": 7936 00:29:00.570 }, 00:29:00.570 { 00:29:00.570 "name": "BaseBdev2", 00:29:00.570 "uuid": "df02b651-586f-52ee-8b2c-7dc2cbee1bfc", 00:29:00.570 "is_configured": true, 00:29:00.570 "data_offset": 256, 00:29:00.570 "data_size": 7936 00:29:00.570 } 00:29:00.570 ] 00:29:00.570 }' 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2174286 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2174286 ']' 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2174286 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2174286 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2174286' 00:29:00.570 killing process with pid 2174286 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2174286 00:29:00.570 Received shutdown signal, test time was about 60.000000 seconds 00:29:00.570 00:29:00.570 Latency(us) 00:29:00.570 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:00.570 =================================================================================================================== 00:29:00.570 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:00.570 [2024-07-12 10:55:35.746062] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:00.570 [2024-07-12 10:55:35.746148] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:00.570 [2024-07-12 10:55:35.746190] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:00.570 [2024-07-12 10:55:35.746201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x110f810 name raid_bdev1, state offline 00:29:00.570 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2174286 00:29:00.828 [2024-07-12 10:55:35.773315] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:00.828 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:29:00.828 00:29:00.828 real 0m26.840s 00:29:00.828 user 0m43.533s 00:29:00.828 sys 0m3.747s 00:29:00.828 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:00.828 10:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:00.828 ************************************ 00:29:00.828 END TEST raid_rebuild_test_sb_md_interleaved 00:29:00.828 ************************************ 00:29:00.828 10:55:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:00.828 10:55:36 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:29:00.828 10:55:36 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:29:00.828 10:55:36 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2174286 ']' 00:29:00.828 10:55:36 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2174286 00:29:01.086 10:55:36 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:01.086 00:29:01.086 real 18m21.133s 00:29:01.086 user 31m7.615s 00:29:01.086 sys 3m20.144s 00:29:01.086 10:55:36 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:01.086 10:55:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:01.086 ************************************ 00:29:01.086 END TEST bdev_raid 00:29:01.086 ************************************ 00:29:01.086 10:55:36 -- common/autotest_common.sh@1142 -- # return 0 00:29:01.086 10:55:36 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:01.086 10:55:36 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:01.086 10:55:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:01.086 10:55:36 -- common/autotest_common.sh@10 -- # set +x 00:29:01.086 ************************************ 00:29:01.086 START TEST bdevperf_config 00:29:01.086 ************************************ 00:29:01.086 10:55:36 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:01.086 * Looking for test storage... 
00:29:01.086 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:01.086 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:01.086 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:01.086 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:29:01.086 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:01.086 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:01.086 10:55:36 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:04.366 10:55:38 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-12 10:55:36.308116] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:04.366 [2024-07-12 10:55:36.308183] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2178205 ] 00:29:04.366 Using job config with 4 jobs 00:29:04.366 [2024-07-12 10:55:36.436228] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:04.366 [2024-07-12 10:55:36.550225] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:04.366 cpumask for '\''job0'\'' is too big 00:29:04.366 cpumask for '\''job1'\'' is too big 00:29:04.366 cpumask for '\''job2'\'' is too big 00:29:04.366 cpumask for '\''job3'\'' is too big 00:29:04.366 Running I/O for 2 seconds... 00:29:04.366 00:29:04.366 Latency(us) 00:29:04.366 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23946.51 23.39 0.00 0.00 10677.25 1852.10 16298.52 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23924.49 23.36 0.00 0.00 10663.36 1852.10 14417.92 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23902.56 23.34 0.00 0.00 10648.63 1837.86 12594.31 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.03 23880.70 23.32 0.00 0.00 10634.77 1837.86 10941.66 00:29:04.366 =================================================================================================================== 00:29:04.366 Total : 95654.27 93.41 0.00 0.00 10656.00 1837.86 16298.52' 00:29:04.366 10:55:38 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-12 10:55:36.308116] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:29:04.366 [2024-07-12 10:55:36.308183] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2178205 ] 00:29:04.366 Using job config with 4 jobs 00:29:04.366 [2024-07-12 10:55:36.436228] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:04.366 [2024-07-12 10:55:36.550225] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:04.366 cpumask for '\''job0'\'' is too big 00:29:04.366 cpumask for '\''job1'\'' is too big 00:29:04.366 cpumask for '\''job2'\'' is too big 00:29:04.366 cpumask for '\''job3'\'' is too big 00:29:04.366 Running I/O for 2 seconds... 00:29:04.366 00:29:04.366 Latency(us) 00:29:04.366 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23946.51 23.39 0.00 0.00 10677.25 1852.10 16298.52 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23924.49 23.36 0.00 0.00 10663.36 1852.10 14417.92 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23902.56 23.34 0.00 0.00 10648.63 1837.86 12594.31 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.03 23880.70 23.32 0.00 0.00 10634.77 1837.86 10941.66 00:29:04.366 =================================================================================================================== 00:29:04.366 Total : 95654.27 93.41 0.00 0.00 10656.00 1837.86 16298.52' 00:29:04.366 10:55:38 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:04.366 10:55:38 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 10:55:36.308116] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:04.366 [2024-07-12 10:55:36.308183] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2178205 ] 00:29:04.366 Using job config with 4 jobs 00:29:04.366 [2024-07-12 10:55:36.436228] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:04.366 [2024-07-12 10:55:36.550225] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:04.366 cpumask for '\''job0'\'' is too big 00:29:04.366 cpumask for '\''job1'\'' is too big 00:29:04.366 cpumask for '\''job2'\'' is too big 00:29:04.366 cpumask for '\''job3'\'' is too big 00:29:04.366 Running I/O for 2 seconds... 
00:29:04.366 00:29:04.366 Latency(us) 00:29:04.366 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23946.51 23.39 0.00 0.00 10677.25 1852.10 16298.52 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23924.49 23.36 0.00 0.00 10663.36 1852.10 14417.92 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.02 23902.56 23.34 0.00 0.00 10648.63 1837.86 12594.31 00:29:04.366 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:04.366 Malloc0 : 2.03 23880.70 23.32 0.00 0.00 10634.77 1837.86 10941.66 00:29:04.366 =================================================================================================================== 00:29:04.366 Total : 95654.27 93.41 0.00 0.00 10656.00 1837.86 16298.52' 00:29:04.366 10:55:38 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:04.366 10:55:38 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:29:04.366 10:55:38 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:04.366 [2024-07-12 10:55:39.051986] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:04.366 [2024-07-12 10:55:39.052050] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2178555 ] 00:29:04.366 [2024-07-12 10:55:39.190560] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:04.366 [2024-07-12 10:55:39.309041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:04.366 cpumask for 'job0' is too big 00:29:04.366 cpumask for 'job1' is too big 00:29:04.366 cpumask for 'job2' is too big 00:29:04.366 cpumask for 'job3' is too big 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:29:06.892 Running I/O for 2 seconds... 
00:29:06.892 00:29:06.892 Latency(us) 00:29:06.892 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:06.892 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:06.892 Malloc0 : 2.02 23882.05 23.32 0.00 0.00 10706.91 1880.60 16412.49 00:29:06.892 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:06.892 Malloc0 : 2.02 23860.02 23.30 0.00 0.00 10691.91 1852.10 14531.90 00:29:06.892 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:06.892 Malloc0 : 2.02 23901.03 23.34 0.00 0.00 10649.61 1866.35 12651.30 00:29:06.892 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:06.892 Malloc0 : 2.03 23879.15 23.32 0.00 0.00 10635.62 1852.10 12252.38 00:29:06.892 =================================================================================================================== 00:29:06.892 Total : 95522.25 93.28 0.00 0.00 10670.94 1852.10 16412.49' 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:06.892 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:06.892 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:06.892 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:06.892 10:55:41 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:09.419 10:55:44 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-12 10:55:41.842473] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:09.419 [2024-07-12 10:55:41.842572] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2178936 ] 00:29:09.419 Using job config with 3 jobs 00:29:09.419 [2024-07-12 10:55:41.984810] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:09.419 [2024-07-12 10:55:42.103577] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:09.419 cpumask for '\''job0'\'' is too big 00:29:09.419 cpumask for '\''job1'\'' is too big 00:29:09.419 cpumask for '\''job2'\'' is too big 00:29:09.419 Running I/O for 2 seconds... 00:29:09.419 00:29:09.419 Latency(us) 00:29:09.419 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:09.419 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.419 Malloc0 : 2.01 32233.87 31.48 0.00 0.00 7940.67 1823.61 11625.52 00:29:09.419 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.419 Malloc0 : 2.02 32242.68 31.49 0.00 0.00 7920.91 1787.99 9801.91 00:29:09.419 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.419 Malloc0 : 2.02 32213.14 31.46 0.00 0.00 7910.97 1787.99 8206.25 00:29:09.419 =================================================================================================================== 00:29:09.419 Total : 96689.69 94.42 0.00 0.00 7924.16 1787.99 11625.52' 00:29:09.419 10:55:44 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-12 10:55:41.842473] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:09.419 [2024-07-12 10:55:41.842572] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2178936 ] 00:29:09.419 Using job config with 3 jobs 00:29:09.419 [2024-07-12 10:55:41.984810] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:09.419 [2024-07-12 10:55:42.103577] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:09.419 cpumask for '\''job0'\'' is too big 00:29:09.419 cpumask for '\''job1'\'' is too big 00:29:09.419 cpumask for '\''job2'\'' is too big 00:29:09.419 Running I/O for 2 seconds... 
00:29:09.419 00:29:09.419 Latency(us) 00:29:09.419 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:09.419 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.419 Malloc0 : 2.01 32233.87 31.48 0.00 0.00 7940.67 1823.61 11625.52 00:29:09.419 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.419 Malloc0 : 2.02 32242.68 31.49 0.00 0.00 7920.91 1787.99 9801.91 00:29:09.419 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.419 Malloc0 : 2.02 32213.14 31.46 0.00 0.00 7910.97 1787.99 8206.25 00:29:09.419 =================================================================================================================== 00:29:09.419 Total : 96689.69 94.42 0.00 0.00 7924.16 1787.99 11625.52' 00:29:09.419 10:55:44 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 10:55:41.842473] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:09.419 [2024-07-12 10:55:41.842572] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2178936 ] 00:29:09.419 Using job config with 3 jobs 00:29:09.420 [2024-07-12 10:55:41.984810] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:09.420 [2024-07-12 10:55:42.103577] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:09.420 cpumask for '\''job0'\'' is too big 00:29:09.420 cpumask for '\''job1'\'' is too big 00:29:09.420 cpumask for '\''job2'\'' is too big 00:29:09.420 Running I/O for 2 seconds... 00:29:09.420 00:29:09.420 Latency(us) 00:29:09.420 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:09.420 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.420 Malloc0 : 2.01 32233.87 31.48 0.00 0.00 7940.67 1823.61 11625.52 00:29:09.420 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.420 Malloc0 : 2.02 32242.68 31.49 0.00 0.00 7920.91 1787.99 9801.91 00:29:09.420 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:09.420 Malloc0 : 2.02 32213.14 31.46 0.00 0.00 7910.97 1787.99 8206.25 00:29:09.420 =================================================================================================================== 00:29:09.420 Total : 96689.69 94.42 0.00 0.00 7924.16 1787.99 11625.52' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:09.420 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:09.420 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:09.420 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:09.420 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:09.420 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:09.420 10:55:44 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:12.729 10:55:47 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-12 10:55:44.598805] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:29:12.729 [2024-07-12 10:55:44.598876] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2179325 ] 00:29:12.729 Using job config with 4 jobs 00:29:12.729 [2024-07-12 10:55:44.739424] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:12.729 [2024-07-12 10:55:44.851065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:12.729 cpumask for '\''job0'\'' is too big 00:29:12.729 cpumask for '\''job1'\'' is too big 00:29:12.729 cpumask for '\''job2'\'' is too big 00:29:12.729 cpumask for '\''job3'\'' is too big 00:29:12.729 Running I/O for 2 seconds... 00:29:12.729 00:29:12.729 Latency(us) 00:29:12.729 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:12.729 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.729 Malloc0 : 2.03 11967.17 11.69 0.00 0.00 21370.25 3875.17 33052.94 00:29:12.729 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.729 Malloc1 : 2.03 11955.93 11.68 0.00 0.00 21370.68 4701.50 33052.94 00:29:12.729 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.729 Malloc0 : 2.04 11945.00 11.67 0.00 0.00 21310.59 3789.69 29177.77 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.04 11933.75 11.65 0.00 0.00 21311.18 4616.01 29177.77 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.04 11922.92 11.64 0.00 0.00 21254.17 3789.69 25416.57 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.04 11911.84 11.63 0.00 0.00 21254.18 4644.51 25302.59 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.05 11994.34 11.71 0.00 0.00 21034.28 3647.22 21655.37 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.05 11983.20 11.70 0.00 0.00 21033.27 2863.64 21655.37 00:29:12.730 =================================================================================================================== 00:29:12.730 Total : 95614.17 93.37 0.00 0.00 21241.78 2863.64 33052.94' 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-12 10:55:44.598805] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:12.730 [2024-07-12 10:55:44.598876] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2179325 ] 00:29:12.730 Using job config with 4 jobs 00:29:12.730 [2024-07-12 10:55:44.739424] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:12.730 [2024-07-12 10:55:44.851065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:12.730 cpumask for '\''job0'\'' is too big 00:29:12.730 cpumask for '\''job1'\'' is too big 00:29:12.730 cpumask for '\''job2'\'' is too big 00:29:12.730 cpumask for '\''job3'\'' is too big 00:29:12.730 Running I/O for 2 seconds... 
00:29:12.730 00:29:12.730 Latency(us) 00:29:12.730 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.03 11967.17 11.69 0.00 0.00 21370.25 3875.17 33052.94 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.03 11955.93 11.68 0.00 0.00 21370.68 4701.50 33052.94 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.04 11945.00 11.67 0.00 0.00 21310.59 3789.69 29177.77 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.04 11933.75 11.65 0.00 0.00 21311.18 4616.01 29177.77 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.04 11922.92 11.64 0.00 0.00 21254.17 3789.69 25416.57 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.04 11911.84 11.63 0.00 0.00 21254.18 4644.51 25302.59 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.05 11994.34 11.71 0.00 0.00 21034.28 3647.22 21655.37 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.05 11983.20 11.70 0.00 0.00 21033.27 2863.64 21655.37 00:29:12.730 =================================================================================================================== 00:29:12.730 Total : 95614.17 93.37 0.00 0.00 21241.78 2863.64 33052.94' 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 10:55:44.598805] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:12.730 [2024-07-12 10:55:44.598876] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2179325 ] 00:29:12.730 Using job config with 4 jobs 00:29:12.730 [2024-07-12 10:55:44.739424] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:12.730 [2024-07-12 10:55:44.851065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:12.730 cpumask for '\''job0'\'' is too big 00:29:12.730 cpumask for '\''job1'\'' is too big 00:29:12.730 cpumask for '\''job2'\'' is too big 00:29:12.730 cpumask for '\''job3'\'' is too big 00:29:12.730 Running I/O for 2 seconds... 
00:29:12.730 00:29:12.730 Latency(us) 00:29:12.730 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.03 11967.17 11.69 0.00 0.00 21370.25 3875.17 33052.94 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.03 11955.93 11.68 0.00 0.00 21370.68 4701.50 33052.94 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.04 11945.00 11.67 0.00 0.00 21310.59 3789.69 29177.77 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.04 11933.75 11.65 0.00 0.00 21311.18 4616.01 29177.77 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.04 11922.92 11.64 0.00 0.00 21254.17 3789.69 25416.57 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.04 11911.84 11.63 0.00 0.00 21254.18 4644.51 25302.59 00:29:12.730 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc0 : 2.05 11994.34 11.71 0.00 0.00 21034.28 3647.22 21655.37 00:29:12.730 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:12.730 Malloc1 : 2.05 11983.20 11.70 0.00 0.00 21033.27 2863.64 21655.37 00:29:12.730 =================================================================================================================== 00:29:12.730 Total : 95614.17 93.37 0.00 0.00 21241.78 2863.64 33052.94' 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:12.730 10:55:47 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:29:12.730 00:29:12.730 real 0m11.199s 00:29:12.730 user 0m9.872s 00:29:12.730 sys 0m1.183s 00:29:12.730 10:55:47 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:12.730 10:55:47 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:29:12.730 ************************************ 00:29:12.730 END TEST bdevperf_config 00:29:12.730 ************************************ 00:29:12.730 10:55:47 -- common/autotest_common.sh@1142 -- # return 0 00:29:12.730 10:55:47 -- spdk/autotest.sh@192 -- # uname -s 00:29:12.730 10:55:47 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:29:12.730 10:55:47 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:12.730 10:55:47 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:12.730 10:55:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:12.730 10:55:47 -- common/autotest_common.sh@10 -- # set +x 00:29:12.730 ************************************ 00:29:12.730 START TEST reactor_set_interrupt 00:29:12.730 ************************************ 00:29:12.730 10:55:47 
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:12.730 * Looking for test storage... 00:29:12.730 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.730 10:55:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:12.730 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:12.731 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.731 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.731 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:29:12.731 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:12.731 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:12.731 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:12.731 10:55:47 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:12.731 10:55:47 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:12.732 10:55:47 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:12.732 10:55:47 
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:12.732 #define SPDK_CONFIG_H 00:29:12.732 #define SPDK_CONFIG_APPS 1 00:29:12.732 #define SPDK_CONFIG_ARCH native 00:29:12.732 #undef SPDK_CONFIG_ASAN 00:29:12.732 #undef SPDK_CONFIG_AVAHI 00:29:12.732 #undef SPDK_CONFIG_CET 00:29:12.732 #define SPDK_CONFIG_COVERAGE 1 00:29:12.732 #define SPDK_CONFIG_CROSS_PREFIX 00:29:12.732 #define SPDK_CONFIG_CRYPTO 1 00:29:12.732 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:12.732 #undef SPDK_CONFIG_CUSTOMOCF 00:29:12.732 #undef SPDK_CONFIG_DAOS 00:29:12.732 #define SPDK_CONFIG_DAOS_DIR 00:29:12.732 #define SPDK_CONFIG_DEBUG 1 00:29:12.732 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:12.732 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:12.732 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:12.732 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:12.732 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:12.732 #undef SPDK_CONFIG_DPDK_UADK 00:29:12.732 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:12.732 #define SPDK_CONFIG_EXAMPLES 1 00:29:12.732 #undef SPDK_CONFIG_FC 00:29:12.732 #define SPDK_CONFIG_FC_PATH 00:29:12.732 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:12.732 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:12.732 #undef SPDK_CONFIG_FUSE 00:29:12.732 #undef SPDK_CONFIG_FUZZER 00:29:12.732 #define SPDK_CONFIG_FUZZER_LIB 00:29:12.732 #undef SPDK_CONFIG_GOLANG 00:29:12.732 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:12.732 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:12.732 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:12.732 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:12.732 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:12.732 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:12.732 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:12.732 #define SPDK_CONFIG_IDXD 1 00:29:12.732 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:12.732 #define SPDK_CONFIG_IPSEC_MB 1 00:29:12.732 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:12.732 #define SPDK_CONFIG_ISAL 1 00:29:12.732 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:12.732 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:12.732 #define SPDK_CONFIG_LIBDIR 00:29:12.732 #undef SPDK_CONFIG_LTO 00:29:12.732 #define SPDK_CONFIG_MAX_LCORES 128 00:29:12.732 #define SPDK_CONFIG_NVME_CUSE 1 00:29:12.732 #undef SPDK_CONFIG_OCF 00:29:12.732 #define SPDK_CONFIG_OCF_PATH 00:29:12.732 #define SPDK_CONFIG_OPENSSL_PATH 00:29:12.732 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:12.732 #define SPDK_CONFIG_PGO_DIR 00:29:12.732 #undef SPDK_CONFIG_PGO_USE 00:29:12.732 #define SPDK_CONFIG_PREFIX /usr/local 00:29:12.732 #undef SPDK_CONFIG_RAID5F 00:29:12.732 #undef SPDK_CONFIG_RBD 00:29:12.732 #define SPDK_CONFIG_RDMA 1 00:29:12.732 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:12.732 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:12.732 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:12.732 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:12.732 #define SPDK_CONFIG_SHARED 1 00:29:12.732 #undef SPDK_CONFIG_SMA 00:29:12.732 #define SPDK_CONFIG_TESTS 1 00:29:12.732 #undef SPDK_CONFIG_TSAN 00:29:12.732 #define SPDK_CONFIG_UBLK 1 00:29:12.732 #define SPDK_CONFIG_UBSAN 1 00:29:12.732 #undef SPDK_CONFIG_UNIT_TESTS 00:29:12.732 #undef SPDK_CONFIG_URING 00:29:12.732 #define SPDK_CONFIG_URING_PATH 00:29:12.732 #undef SPDK_CONFIG_URING_ZNS 00:29:12.732 #undef SPDK_CONFIG_USDT 00:29:12.732 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:12.732 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:12.732 #undef SPDK_CONFIG_VFIO_USER 00:29:12.732 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:12.732 #define SPDK_CONFIG_VHOST 1 00:29:12.732 #define SPDK_CONFIG_VIRTIO 1 00:29:12.732 #undef SPDK_CONFIG_VTUNE 00:29:12.732 #define SPDK_CONFIG_VTUNE_DIR 00:29:12.732 #define SPDK_CONFIG_WERROR 1 00:29:12.732 #define SPDK_CONFIG_WPDK_DIR 00:29:12.732 #undef SPDK_CONFIG_XNVME 00:29:12.732 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:12.732 10:55:47 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:12.732 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:12.732 10:55:47 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:12.732 10:55:47 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:12.732 10:55:47 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:12.732 10:55:47 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.732 10:55:47 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.732 10:55:47 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.732 10:55:47 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:29:12.732 10:55:47 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.732 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:12.732 10:55:47 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:12.733 
10:55:47 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:12.733 10:55:47 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:12.733 
10:55:47 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:29:12.733 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:29:12.734 10:55:47 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:12.734 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:12.735 
10:55:47 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2179712 ]] 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2179712 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.aae5xX 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.aae5xX/tests/interrupt /tmp/spdk.aae5xX 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4338139136 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88735903744 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508544000 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5772640256 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249559552 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254269952 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892328960 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901708800 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9379840 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253569536 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=704512 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:12.735 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 
00:29:12.736 * Looking for test storage... 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88735903744 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7987232768 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.736 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2179759 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:12.736 10:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2179759 /var/tmp/spdk.sock 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2179759 ']' 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:12.736 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
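What follows is waitforlisten blocking until the freshly started interrupt_tgt (pid 2179759, cpu mask 0x07) answers on /var/tmp/spdk.sock; the helper's own body is hidden behind xtrace_disable, so the sketch below is only a rough, assumed equivalent of that wait, not the real implementation:

    # Hypothetical stand-in for the wait step: poll for the UNIX-domain RPC
    # socket while making sure the target process is still alive.
    wait_for_rpc_sock() {
        local pid=$1 sock=${2:-/var/tmp/spdk.sock} max_retries=${3:-100}
        while ((max_retries-- > 0)); do
            kill -0 "$pid" 2>/dev/null || return 1   # target died before listening
            [[ -S $sock ]] && return 0               # socket exists, RPC should be reachable
            sleep 0.1
        done
        return 1                                     # gave up after max_retries polls
    }
    # usage: wait_for_rpc_sock "$intr_tgt_pid" /var/tmp/spdk.sock || exit 1

The real helper presumably also retries an actual RPC rather than only checking for the socket file; max_retries=100 is the value visible in the trace.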
00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:12.736 10:55:47 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:12.736 [2024-07-12 10:55:47.703766] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:12.736 [2024-07-12 10:55:47.703842] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2179759 ] 00:29:12.736 [2024-07-12 10:55:47.833999] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:12.995 [2024-07-12 10:55:47.936872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:12.995 [2024-07-12 10:55:47.936958] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:12.995 [2024-07-12 10:55:47.936964] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:12.995 [2024-07-12 10:55:48.010839] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:13.561 10:55:48 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:13.561 10:55:48 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:13.561 10:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:29:13.561 10:55:48 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:13.820 Malloc0 00:29:13.820 Malloc1 00:29:13.820 Malloc2 00:29:13.820 10:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:29:13.820 10:55:48 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:13.820 10:55:49 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:13.820 10:55:49 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:14.079 5000+0 records in 00:29:14.079 5000+0 records out 00:29:14.079 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0234037 s, 438 MB/s 00:29:14.079 10:55:49 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:14.079 AIO0 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2179759 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2179759 without_thd 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2179759 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:14.337 10:55:49 reactor_set_interrupt -- 
interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:14.337 10:55:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:14.596 10:55:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:14.854 spdk_thread ids are 1 on reactor0. 
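The thread-id lookup that just printed "spdk_thread ids are 1 on reactor0." pairs rpc.py thread_get_stats with a jq filter on each thread's cpumask. The trace shows the mask already reduced from 0x1 and 0x4 to 1 and 4; the parameter expansion below is one way to do that reduction. Pulled out of the helper as a standalone sketch:

    # Resolve which spdk_thread ids are scheduled on a given reactor by matching
    # the reactor's cpumask against thread_get_stats output, as traced above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    reactor_cpumask=0x1                    # reactor 0; use 0x4 for reactor 2
    reactor_cpumask=${reactor_cpumask#0x}  # match the stripped value seen in the trace
    "$rpc" thread_get_stats |
        jq --arg reactor_cpumask "$reactor_cpumask" \
           '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'

An empty result, as the reactor 2 query above returned, simply means no spdk_thread reported that cpumask at the time of the query.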
00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2179759 0 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2179759 0 idle 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2179759 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2179759 -w 256 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2179759 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:00.41 reactor_0' 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2179759 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:00.41 reactor_0 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2179759 1 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2179759 1 idle 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2179759 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2179759 -w 256 00:29:14.854 10:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2179764 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2179764 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2179759 2 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2179759 2 idle 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2179759 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2179759 -w 256 00:29:15.112 10:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2179765 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2179765 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
00:29:15.371 10:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:29:15.631 [2024-07-12 10:55:50.570191] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:15.631 10:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:29:15.631 [2024-07-12 10:55:50.817562] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:29:15.631 [2024-07-12 10:55:50.821473] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:15.889 10:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:29:15.889 [2024-07-12 10:55:51.069474] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:29:15.889 [2024-07-12 10:55:51.069606] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2179759 0 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2179759 0 busy 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2179759 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2179759 -w 256 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2179759 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0' 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2179759 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:16.146 10:55:51 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2179759 2 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2179759 2 busy 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2179759 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2179759 -w 256 00:29:16.146 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2179765 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2179765 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:16.404 10:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:16.662 [2024-07-12 10:55:51.673471] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:29:16.662 [2024-07-12 10:55:51.673581] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2179759 2 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2179759 2 idle 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2179759 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2179759 -w 256 00:29:16.662 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2179765 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2179765 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:16.920 10:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:16.920 [2024-07-12 10:55:52.025459] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:16.920 [2024-07-12 10:55:52.025565] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:16.920 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:29:16.920 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:29:16.920 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:29:17.179 [2024-07-12 10:55:52.201912] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
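The cycle that just completed is the core of the without_thd pass: pin the reactor-0 threads away with thread_set_cpumask, flip reactors 0 and 2 to poll mode with rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode <n> -d and confirm they spin near 100% CPU, then drop the -d to re-enable interrupt mode and confirm they go idle again. Each busy/idle probe reduces to reading the %CPU column top reports for the reactor_<idx> thread; a condensed sketch of that probe (retry loop and exact thresholds simplified relative to interrupt/common.sh):

    # Sample %CPU for the reactor_<idx> thread of <pid>, as the traced
    # "top -bHn 1 -p <pid> -w 256 | grep reactor_<idx>" pipeline does.
    reactor_cpu_rate() {
        local pid=$1 idx=$2
        top -bHn 1 -p "$pid" -w 256 |
            grep "reactor_${idx}" |
            sed -e 's/^\s*//g' |
            awk '{print $9}' |
            cut -d. -f1            # keep the integer part, e.g. 99.9 -> 99
    }
    rate=$(reactor_cpu_rate "$intr_tgt_pid" 2)
    if (( ${rate:-0} >= 70 )); then echo "reactor_2 busy"; else echo "reactor_2 idle-ish"; fi

The 70/30-style thresholds and the 10-attempt countdown visible in the trace exist because a single one-second top sample of an interrupt-mode reactor can still show a few percent of CPU, as the 6.7% reading for reactor_0 earlier did.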
00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2179759 0 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2179759 0 idle 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2179759 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2179759 -w 256 00:29:17.179 10:55:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2179759 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.61 reactor_0' 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2179759 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.61 reactor_0 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:29:17.437 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2179759 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2179759 ']' 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2179759 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2179759 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2179759' 00:29:17.437 killing process with pid 2179759 00:29:17.437 10:55:52 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 2179759 00:29:17.437 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2179759 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2180475 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2180475 /var/tmp/spdk.sock 00:29:17.696 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2180475 ']' 00:29:17.696 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:17.696 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:17.696 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:17.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:17.696 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:17.696 10:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:17.696 10:55:52 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:17.696 [2024-07-12 10:55:52.755099] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:17.696 [2024-07-12 10:55:52.755164] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2180475 ] 00:29:17.696 [2024-07-12 10:55:52.884511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:17.954 [2024-07-12 10:55:52.993568] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:17.954 [2024-07-12 10:55:52.993654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:17.954 [2024-07-12 10:55:52.993660] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:17.954 [2024-07-12 10:55:53.074835] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
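The killprocess step traced just above is deliberately defensive: it re-reads the command name for the pid, special-cases sudo, prints the "killing process with pid ..." marker, then kill and wait so the exit is reaped before cleanup removes the aiofile. A compact sketch along those lines (simplified; the real helper's sudo branch and platform checks are not visible in the trace):

    # Minimal take on the teardown: confirm the pid still looks like our target,
    # then terminate it and wait for it so cleanup can safely proceed.
    killprocess_sketch() {
        local pid=$1 process_name
        process_name=$(ps --no-headers -o comm= "$pid") || return 0   # already gone
        if [[ $process_name == sudo ]]; then
            return 1   # the real helper treats this case specially; just bail out here
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true   # works because the target is our child
    }
    # usage: killprocess_sketch "$intr_tgt_pid"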
00:29:18.519 10:55:53 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:18.519 10:55:53 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:18.519 10:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:29:18.519 10:55:53 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:18.776 Malloc0 00:29:18.776 Malloc1 00:29:18.776 Malloc2 00:29:18.776 10:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:29:18.776 10:55:53 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:18.776 10:55:53 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:18.776 10:55:53 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:18.776 5000+0 records in 00:29:18.776 5000+0 records out 00:29:18.776 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0275784 s, 371 MB/s 00:29:18.776 10:55:53 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:19.033 AIO0 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2180475 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2180475 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2180475 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:19.033 10:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:19.289 10:55:54 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:19.546 spdk_thread ids are 1 on reactor0. 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2180475 0 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2180475 0 idle 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2180475 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2180475 -w 256 00:29:19.546 10:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2180475 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.40 reactor_0' 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2180475 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.40 reactor_0 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2180475 1 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2180475 1 idle 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2180475 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:29:19.804 10:55:54 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2180475 -w 256 00:29:19.804 10:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2180520 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2180520 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2180475 2 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2180475 2 idle 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2180475 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2180475 -w 256 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2180521 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2180521 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:29:20.100 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:29:20.357 [2024-07-12 10:55:55.498274] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:29:20.357 [2024-07-12 10:55:55.498471] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:29:20.357 [2024-07-12 10:55:55.498587] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:20.357 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:29:20.613 [2024-07-12 10:55:55.746799] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:29:20.614 [2024-07-12 10:55:55.746964] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2180475 0 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2180475 0 busy 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2180475 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2180475 -w 256 00:29:20.614 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:20.870 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2180475 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0' 00:29:20.870 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2180475 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0 00:29:20.870 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:20.870 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:20.870 10:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:20.870 10:55:55 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:20.871 10:55:55 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2180475 2 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2180475 2 busy 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2180475 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2180475 -w 256 00:29:20.871 10:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2180521 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2180521 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:21.128 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:21.385 [2024-07-12 10:55:56.352491] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
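The reactor_is_idle / reactor_is_busy probes traced above and below all reduce to sampling top once and thresholding the %CPU column. A condensed sketch, assuming the column layout shown in the trace and omitting the retry loop behind the "(( j = 10 ))" counters (reactor_state_matches is a name introduced here for illustration):

  # Condensed form of the busy/idle probe; the real helper retries several times.
  reactor_state_matches() {
      local pid=$1 idx=$2 state=$3 line cpu
      line=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_$idx")
      cpu=$(echo "$line" | sed -e 's/^\s*//g' | awk '{print $9}')   # %CPU column
      cpu=${cpu%.*}                                                 # 99.9 -> 99, 0.0 -> 0
      if [[ $state = busy ]]; then
          [[ $cpu -ge 70 ]]    # poll-mode reactors should be pinned near 100%
      else
          [[ $cpu -le 30 ]]    # interrupt-mode reactors should be essentially idle
      fi
  }
  # e.g. reactor_state_matches 2180475 0 busy   # matches the "R 99.9 ... reactor_0" sample above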
00:29:21.385 [2024-07-12 10:55:56.352592] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2180475 2 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2180475 2 idle 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2180475 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2180475 -w 256 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2180521 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2180521 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:21.385 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:21.643 [2024-07-12 10:55:56.773576] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:21.643 [2024-07-12 10:55:56.773716] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
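Stripped of the assertions, the mode toggling exercised by this test is four RPC calls against the running target; as the NOTICE lines show, -d disables interrupt mode (forcing the reactor back to polling) and the same call without -d re-enables it:

  rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  # Reactors 0 and 2 to poll mode -- they should then show up ~100% busy in top.
  $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
  $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
  # Back to interrupt mode -- they should drop back to ~0% CPU.
  $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 2
  $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0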
00:29:21.643 [2024-07-12 10:55:56.773740] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2180475 0 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2180475 0 idle 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2180475 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2180475 -w 256 00:29:21.643 10:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2180475 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.68 reactor_0' 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2180475 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.68 reactor_0 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:29:21.902 10:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2180475 00:29:21.902 10:55:56 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2180475 ']' 00:29:21.902 10:55:56 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2180475 00:29:21.902 10:55:56 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:21.902 10:55:56 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:21.902 10:55:56 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2180475 00:29:21.902 10:55:57 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:21.902 10:55:57 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:29:21.902 10:55:57 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2180475' 00:29:21.902 killing process with pid 2180475 00:29:21.902 10:55:57 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2180475 00:29:21.902 10:55:57 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2180475 00:29:22.160 10:55:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:29:22.160 10:55:57 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:22.160 00:29:22.160 real 0m9.904s 00:29:22.160 user 0m8.975s 00:29:22.160 sys 0m2.417s 00:29:22.160 10:55:57 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:22.160 10:55:57 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:22.160 ************************************ 00:29:22.160 END TEST reactor_set_interrupt 00:29:22.160 ************************************ 00:29:22.160 10:55:57 -- common/autotest_common.sh@1142 -- # return 0 00:29:22.160 10:55:57 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:22.160 10:55:57 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:22.160 10:55:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:22.160 10:55:57 -- common/autotest_common.sh@10 -- # set +x 00:29:22.421 ************************************ 00:29:22.421 START TEST reap_unregistered_poller 00:29:22.421 ************************************ 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:22.421 * Looking for test storage... 00:29:22.421 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.421 10:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:22.421 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:22.421 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.421 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.421 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
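The reactor_set_interrupt teardown traced above follows the usual killprocess pattern: confirm the pid is set and still alive, check it is not a sudo wrapper, kill it, wait for it, then remove the AIO scratch file. A condensed sketch (the function name is real, the body is simplified; the sudo-wrapped case is handled differently in the real helper):

  killprocess() {
      local pid=$1
      [[ -n $pid ]] || return 1
      kill -0 "$pid" 2>/dev/null || return 0       # nothing to do if it already exited
      if [[ $(ps --no-headers -o comm= "$pid") != sudo ]]; then
          echo "killing process with pid $pid"
          kill "$pid"
      fi
      wait "$pid"
  }
  killprocess 2180475
  rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile   # cleanup, as in the trace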
00:29:22.421 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:22.421 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
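The reap_unregistered_poller preamble at the start of this block is the standard interrupt_common.sh bootstrap: derive testdir and rootdir from the script's own location, then source autotest_common.sh, which is what pulls in the long CONFIG_* listing from build_config.sh echoed here. In outline (a sketch using the paths printed in the trace):

  testdir=$(readlink -f "$(dirname "$0")")           # .../spdk/test/interrupt
  rootdir=$(readlink -f "$testdir/../..")            # .../spdk
  source "$rootdir/test/common/autotest_common.sh"   # sources build_config.sh, applications.sh, pm/common, ...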
00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:22.421 10:55:57 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:22.421 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:22.421 10:55:57 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:22.422 #define SPDK_CONFIG_H 00:29:22.422 #define SPDK_CONFIG_APPS 1 00:29:22.422 #define SPDK_CONFIG_ARCH native 00:29:22.422 #undef SPDK_CONFIG_ASAN 00:29:22.422 #undef SPDK_CONFIG_AVAHI 00:29:22.422 #undef SPDK_CONFIG_CET 00:29:22.422 #define SPDK_CONFIG_COVERAGE 1 00:29:22.422 #define SPDK_CONFIG_CROSS_PREFIX 00:29:22.422 #define SPDK_CONFIG_CRYPTO 1 00:29:22.422 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:22.422 #undef SPDK_CONFIG_CUSTOMOCF 00:29:22.422 #undef SPDK_CONFIG_DAOS 00:29:22.422 #define SPDK_CONFIG_DAOS_DIR 00:29:22.422 #define SPDK_CONFIG_DEBUG 1 00:29:22.422 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:22.422 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:22.422 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:22.422 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:22.422 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:22.422 #undef SPDK_CONFIG_DPDK_UADK 00:29:22.422 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:22.422 #define SPDK_CONFIG_EXAMPLES 1 00:29:22.422 #undef SPDK_CONFIG_FC 00:29:22.422 #define SPDK_CONFIG_FC_PATH 00:29:22.422 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:22.422 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:22.422 #undef SPDK_CONFIG_FUSE 00:29:22.422 #undef SPDK_CONFIG_FUZZER 00:29:22.422 #define SPDK_CONFIG_FUZZER_LIB 00:29:22.422 #undef SPDK_CONFIG_GOLANG 00:29:22.422 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:22.422 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:22.422 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:22.422 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:22.422 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:22.422 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:22.422 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:22.422 #define SPDK_CONFIG_IDXD 1 00:29:22.422 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:22.422 #define SPDK_CONFIG_IPSEC_MB 1 00:29:22.422 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:22.422 #define SPDK_CONFIG_ISAL 1 00:29:22.422 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:22.422 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:22.422 #define SPDK_CONFIG_LIBDIR 00:29:22.422 #undef SPDK_CONFIG_LTO 
00:29:22.422 #define SPDK_CONFIG_MAX_LCORES 128 00:29:22.422 #define SPDK_CONFIG_NVME_CUSE 1 00:29:22.422 #undef SPDK_CONFIG_OCF 00:29:22.422 #define SPDK_CONFIG_OCF_PATH 00:29:22.422 #define SPDK_CONFIG_OPENSSL_PATH 00:29:22.422 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:22.422 #define SPDK_CONFIG_PGO_DIR 00:29:22.422 #undef SPDK_CONFIG_PGO_USE 00:29:22.422 #define SPDK_CONFIG_PREFIX /usr/local 00:29:22.422 #undef SPDK_CONFIG_RAID5F 00:29:22.422 #undef SPDK_CONFIG_RBD 00:29:22.422 #define SPDK_CONFIG_RDMA 1 00:29:22.422 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:22.422 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:22.422 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:22.422 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:22.422 #define SPDK_CONFIG_SHARED 1 00:29:22.422 #undef SPDK_CONFIG_SMA 00:29:22.422 #define SPDK_CONFIG_TESTS 1 00:29:22.422 #undef SPDK_CONFIG_TSAN 00:29:22.422 #define SPDK_CONFIG_UBLK 1 00:29:22.422 #define SPDK_CONFIG_UBSAN 1 00:29:22.422 #undef SPDK_CONFIG_UNIT_TESTS 00:29:22.422 #undef SPDK_CONFIG_URING 00:29:22.422 #define SPDK_CONFIG_URING_PATH 00:29:22.422 #undef SPDK_CONFIG_URING_ZNS 00:29:22.422 #undef SPDK_CONFIG_USDT 00:29:22.422 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:22.422 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:22.422 #undef SPDK_CONFIG_VFIO_USER 00:29:22.422 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:22.422 #define SPDK_CONFIG_VHOST 1 00:29:22.422 #define SPDK_CONFIG_VIRTIO 1 00:29:22.422 #undef SPDK_CONFIG_VTUNE 00:29:22.422 #define SPDK_CONFIG_VTUNE_DIR 00:29:22.422 #define SPDK_CONFIG_WERROR 1 00:29:22.422 #define SPDK_CONFIG_WPDK_DIR 00:29:22.422 #undef SPDK_CONFIG_XNVME 00:29:22.422 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:22.422 10:55:57 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:22.422 10:55:57 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:22.422 10:55:57 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:22.422 10:55:57 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:22.422 10:55:57 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:29:22.422 10:55:57 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:22.422 10:55:57 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:22.422 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
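The long run of ": 0" / "export SPDK_TEST_*" pairs here and continuing below corresponds to a default-then-export idiom: each knob keeps whatever value the job's configuration already set (SPDK_TEST_CRYPTO and SPDK_RUN_UBSAN show up as 1 in this run) and falls back to 0 otherwise. An illustrative restatement, not the verbatim autotest_common.sh lines:

  : "${SPDK_TEST_CRYPTO:=1}"      # already 1 for this job, so the default is a no-op
  : "${SPDK_TEST_NVME:=0}"        # not requested here, defaults to 0
  export SPDK_TEST_CRYPTO SPDK_TEST_NVME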
00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:22.423 10:55:57 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:22.423 10:55:57 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
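Two of the exports above matter directly for the RPC calls used throughout these interrupt tests: rpc.py imports the module named by --plugin from the Python path, so the PYTHONPATH entries are what let the reactor_set_interrupt_mode calls earlier resolve, and LD_LIBRARY_PATH points at the freshly built SPDK/DPDK libraries the target was started with. For example (paths as in the trace, command as used above):

  export PYTHONPATH=$PYTHONPATH:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d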
00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:22.423 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2181160 ]] 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2181160 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v 
testdir ]] 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.Jqut9a 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:22.424 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.Jqut9a/tests/interrupt /tmp/spdk.Jqut9a 00:29:22.683 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88735748096 00:29:22.684 10:55:57 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508544000 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5772795904 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249559552 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254269952 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892328960 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901708800 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9379840 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253569536 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=704512 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:29:22.684 * Looking for test storage... 
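For reference, the storage probe traced above (df -T parsed into the mounts/fss/sizes/avails arrays, then a candidate directory with enough free space selected) reduces to roughly the shell sketch below; the df flags and the loop are a simplified illustration, not the suite's exact set_test_storage logic.

# simplified equivalent: take the first candidate dir whose filesystem has the requested free space
requested_size=2214592512   # 2 GiB plus slack, as in the trace above
for target_dir in "$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback"; do
    target_space=$(df -B1 --output=avail "$target_dir" | tail -n 1)
    if [ "$target_space" -ge "$requested_size" ]; then
        export SPDK_TEST_STORAGE="$target_dir"
        printf '* Found test storage at %s\n' "$target_dir"
        break
    fi
done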
00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88735748096 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7987388416 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.684 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2181201 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:22.684 10:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2181201 /var/tmp/spdk.sock 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 2181201 ']' 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:29:22.684 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:22.684 10:55:57 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:22.684 [2024-07-12 10:55:57.688968] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:22.684 [2024-07-12 10:55:57.689044] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2181201 ] 00:29:22.684 [2024-07-12 10:55:57.819884] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:22.944 [2024-07-12 10:55:57.919659] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:22.944 [2024-07-12 10:55:57.919744] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:22.944 [2024-07-12 10:55:57.919748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:22.944 [2024-07-12 10:55:57.994433] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:23.512 10:55:58 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:23.512 10:55:58 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:29:23.512 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:29:23.512 10:55:58 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:23.512 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:29:23.512 10:55:58 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:23.512 10:55:58 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:23.512 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:29:23.512 "name": "app_thread", 00:29:23.512 "id": 1, 00:29:23.512 "active_pollers": [], 00:29:23.512 "timed_pollers": [ 00:29:23.512 { 00:29:23.512 "name": "rpc_subsystem_poll_servers", 00:29:23.512 "id": 1, 00:29:23.512 "state": "waiting", 00:29:23.512 "run_count": 0, 00:29:23.512 "busy_count": 0, 00:29:23.512 "period_ticks": 9200000 00:29:23.512 } 00:29:23.512 ], 00:29:23.512 "paused_pollers": [] 00:29:23.512 }' 00:29:23.512 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/common.sh@76 
-- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:23.771 5000+0 records in 00:29:23.771 5000+0 records out 00:29:23.771 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0198547 s, 516 MB/s 00:29:23.771 10:55:58 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:24.031 AIO0 00:29:24.031 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:24.290 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:29:24.290 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:29:24.290 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:29:24.290 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:24.290 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:24.290 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:24.290 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:29:24.290 "name": "app_thread", 00:29:24.290 "id": 1, 00:29:24.290 "active_pollers": [], 00:29:24.290 "timed_pollers": [ 00:29:24.290 { 00:29:24.290 "name": "rpc_subsystem_poll_servers", 00:29:24.290 "id": 1, 00:29:24.290 "state": "waiting", 00:29:24.290 "run_count": 0, 00:29:24.290 "busy_count": 0, 00:29:24.290 "period_ticks": 9200000 00:29:24.290 } 00:29:24.290 ], 00:29:24.290 "paused_pollers": [] 00:29:24.290 }' 00:29:24.290 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:29:24.549 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:29:24.549 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:29:24.549 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:29:24.549 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:29:24.549 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:29:24.549 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:29:24.549 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2181201 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 2181201 ']' 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 2181201 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2181201 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:24.549 
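The poller check performed above (dump thread 0's pollers over the app's RPC socket and pull out the poller names) can be reproduced by hand along these lines; the socket path and jq filters mirror the trace, while the standalone commands are illustrative rather than the test's exact rpc_cmd wrapper.

# list active and timed pollers of the first SPDK thread (app_thread)
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc -s /var/tmp/spdk.sock thread_get_pollers | jq -r '.threads[0].active_pollers[].name'   # empty in the run above
$rpc -s /var/tmp/spdk.sock thread_get_pollers | jq -r '.threads[0].timed_pollers[].name'    # rpc_subsystem_poll_servers above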
10:55:59 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2181201' 00:29:24.549 killing process with pid 2181201 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 2181201 00:29:24.549 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 2181201 00:29:24.808 10:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:29:24.808 10:55:59 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:24.808 00:29:24.808 real 0m2.472s 00:29:24.808 user 0m1.578s 00:29:24.808 sys 0m0.656s 00:29:24.808 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:24.808 10:55:59 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:24.808 ************************************ 00:29:24.808 END TEST reap_unregistered_poller 00:29:24.808 ************************************ 00:29:24.808 10:55:59 -- common/autotest_common.sh@1142 -- # return 0 00:29:24.808 10:55:59 -- spdk/autotest.sh@198 -- # uname -s 00:29:24.808 10:55:59 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:29:24.808 10:55:59 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:29:24.808 10:55:59 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:29:24.808 10:55:59 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@260 -- # timing_exit lib 00:29:24.808 10:55:59 -- common/autotest_common.sh@728 -- # xtrace_disable 00:29:24.808 10:55:59 -- common/autotest_common.sh@10 -- # set +x 00:29:24.808 10:55:59 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:29:24.808 10:55:59 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:29:24.809 10:55:59 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:29:24.809 10:55:59 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:29:24.809 10:55:59 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:29:24.809 10:55:59 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:29:24.809 10:55:59 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:24.809 10:55:59 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:24.809 10:55:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:24.809 10:55:59 -- common/autotest_common.sh@10 -- # set +x 00:29:24.809 ************************************ 00:29:24.809 START TEST compress_compdev 00:29:24.809 ************************************ 00:29:24.809 10:55:59 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:25.068 * Looking for test storage... 
00:29:25.068 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:25.068 10:56:00 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:25.068 10:56:00 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:25.069 10:56:00 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:25.069 10:56:00 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:25.069 10:56:00 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:25.069 10:56:00 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.069 10:56:00 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.069 10:56:00 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.069 10:56:00 compress_compdev -- paths/export.sh@5 -- # export PATH 00:29:25.069 10:56:00 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:25.069 10:56:00 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2181647 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2181647 00:29:25.069 10:56:00 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:25.069 10:56:00 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2181647 ']' 00:29:25.069 10:56:00 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:25.069 10:56:00 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:25.069 10:56:00 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:25.069 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
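The bdevperf invocation traced above follows the pattern used throughout this suite: start the app idle, build the compress volume over RPC, then drive the workload from the helper script. A condensed, illustrative version (flags copied from the trace; waitforlisten and error handling omitted):

# start bdevperf waiting for RPC (-z): QD 32, 4 KiB verify for 3 s on cores 1-2, DPDK compressdev config
spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
$spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
    -c $spdk/test/compress/dpdk.json &
bdevperf_pid=$!
# ... create Nvme0n1 -> lvs0 -> lvs0/lv0 -> COMP_lvs0/lv0 via rpc.py, as traced below ...
$spdk/examples/bdev/bdevperf/bdevperf.py perform_tests    # kicks off the 3-second verify run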
00:29:25.069 10:56:00 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:25.069 10:56:00 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:25.069 [2024-07-12 10:56:00.180380] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:29:25.069 [2024-07-12 10:56:00.180449] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2181647 ] 00:29:25.328 [2024-07-12 10:56:00.298683] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:25.328 [2024-07-12 10:56:00.396683] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:25.328 [2024-07-12 10:56:00.396689] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:26.265 [2024-07-12 10:56:01.140914] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:26.265 10:56:01 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:26.265 10:56:01 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:26.265 10:56:01 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:29:26.265 10:56:01 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:26.265 10:56:01 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:26.852 [2024-07-12 10:56:01.787698] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17fc3c0 PMD being used: compress_qat 00:29:26.852 10:56:01 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:26.852 10:56:01 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:26.852 10:56:01 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:26.852 10:56:01 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:26.852 10:56:01 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:26.852 10:56:01 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:26.852 10:56:01 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:27.110 10:56:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:27.110 [ 00:29:27.110 { 00:29:27.110 "name": "Nvme0n1", 00:29:27.110 "aliases": [ 00:29:27.110 "01000000-0000-0000-5cd2-e43197705251" 00:29:27.110 ], 00:29:27.110 "product_name": "NVMe disk", 00:29:27.110 "block_size": 512, 00:29:27.110 "num_blocks": 15002931888, 00:29:27.110 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:27.110 "assigned_rate_limits": { 00:29:27.110 "rw_ios_per_sec": 0, 00:29:27.110 "rw_mbytes_per_sec": 0, 00:29:27.110 "r_mbytes_per_sec": 0, 00:29:27.110 "w_mbytes_per_sec": 0 00:29:27.110 }, 00:29:27.110 "claimed": false, 00:29:27.110 "zoned": false, 00:29:27.110 "supported_io_types": { 00:29:27.110 "read": true, 00:29:27.110 "write": true, 00:29:27.110 "unmap": true, 00:29:27.110 "flush": true, 00:29:27.110 "reset": true, 00:29:27.110 "nvme_admin": true, 00:29:27.110 "nvme_io": true, 00:29:27.110 "nvme_io_md": false, 00:29:27.110 "write_zeroes": true, 00:29:27.110 "zcopy": false, 
00:29:27.110 "get_zone_info": false, 00:29:27.110 "zone_management": false, 00:29:27.110 "zone_append": false, 00:29:27.110 "compare": false, 00:29:27.110 "compare_and_write": false, 00:29:27.110 "abort": true, 00:29:27.110 "seek_hole": false, 00:29:27.110 "seek_data": false, 00:29:27.110 "copy": false, 00:29:27.110 "nvme_iov_md": false 00:29:27.110 }, 00:29:27.110 "driver_specific": { 00:29:27.110 "nvme": [ 00:29:27.110 { 00:29:27.110 "pci_address": "0000:5e:00.0", 00:29:27.110 "trid": { 00:29:27.110 "trtype": "PCIe", 00:29:27.111 "traddr": "0000:5e:00.0" 00:29:27.111 }, 00:29:27.111 "ctrlr_data": { 00:29:27.111 "cntlid": 0, 00:29:27.111 "vendor_id": "0x8086", 00:29:27.111 "model_number": "INTEL SSDPF2KX076TZO", 00:29:27.111 "serial_number": "PHAC0301002G7P6CGN", 00:29:27.111 "firmware_revision": "JCV10200", 00:29:27.111 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:27.111 "oacs": { 00:29:27.111 "security": 1, 00:29:27.111 "format": 1, 00:29:27.111 "firmware": 1, 00:29:27.111 "ns_manage": 1 00:29:27.111 }, 00:29:27.111 "multi_ctrlr": false, 00:29:27.111 "ana_reporting": false 00:29:27.111 }, 00:29:27.111 "vs": { 00:29:27.111 "nvme_version": "1.3" 00:29:27.111 }, 00:29:27.111 "ns_data": { 00:29:27.111 "id": 1, 00:29:27.111 "can_share": false 00:29:27.111 }, 00:29:27.111 "security": { 00:29:27.111 "opal": true 00:29:27.111 } 00:29:27.111 } 00:29:27.111 ], 00:29:27.111 "mp_policy": "active_passive" 00:29:27.111 } 00:29:27.111 } 00:29:27.111 ] 00:29:27.369 10:56:02 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:27.369 10:56:02 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:27.369 [2024-07-12 10:56:02.541556] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16610d0 PMD being used: compress_qat 00:29:29.961 84d424c5-fa9d-4484-b03a-ff289b8dc19f 00:29:29.961 10:56:04 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:29.961 3575db14-827d-46b1-a7b9-a6f5da0e94f9 00:29:29.961 10:56:05 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:29.961 10:56:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:29.961 10:56:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:29.961 10:56:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:29.961 10:56:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:29.961 10:56:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:29.961 10:56:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:30.217 10:56:05 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:30.217 [ 00:29:30.217 { 00:29:30.217 "name": "3575db14-827d-46b1-a7b9-a6f5da0e94f9", 00:29:30.217 "aliases": [ 00:29:30.217 "lvs0/lv0" 00:29:30.217 ], 00:29:30.217 "product_name": "Logical Volume", 00:29:30.217 "block_size": 512, 00:29:30.217 "num_blocks": 204800, 00:29:30.217 "uuid": "3575db14-827d-46b1-a7b9-a6f5da0e94f9", 00:29:30.217 "assigned_rate_limits": { 00:29:30.217 "rw_ios_per_sec": 0, 00:29:30.217 "rw_mbytes_per_sec": 0, 00:29:30.217 "r_mbytes_per_sec": 0, 00:29:30.217 
"w_mbytes_per_sec": 0 00:29:30.217 }, 00:29:30.217 "claimed": false, 00:29:30.217 "zoned": false, 00:29:30.217 "supported_io_types": { 00:29:30.217 "read": true, 00:29:30.217 "write": true, 00:29:30.217 "unmap": true, 00:29:30.217 "flush": false, 00:29:30.217 "reset": true, 00:29:30.217 "nvme_admin": false, 00:29:30.217 "nvme_io": false, 00:29:30.217 "nvme_io_md": false, 00:29:30.217 "write_zeroes": true, 00:29:30.217 "zcopy": false, 00:29:30.217 "get_zone_info": false, 00:29:30.217 "zone_management": false, 00:29:30.217 "zone_append": false, 00:29:30.217 "compare": false, 00:29:30.217 "compare_and_write": false, 00:29:30.217 "abort": false, 00:29:30.217 "seek_hole": true, 00:29:30.217 "seek_data": true, 00:29:30.217 "copy": false, 00:29:30.217 "nvme_iov_md": false 00:29:30.217 }, 00:29:30.217 "driver_specific": { 00:29:30.217 "lvol": { 00:29:30.217 "lvol_store_uuid": "84d424c5-fa9d-4484-b03a-ff289b8dc19f", 00:29:30.217 "base_bdev": "Nvme0n1", 00:29:30.217 "thin_provision": true, 00:29:30.217 "num_allocated_clusters": 0, 00:29:30.217 "snapshot": false, 00:29:30.217 "clone": false, 00:29:30.217 "esnap_clone": false 00:29:30.217 } 00:29:30.217 } 00:29:30.217 } 00:29:30.217 ] 00:29:30.475 10:56:05 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:30.475 10:56:05 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:30.475 10:56:05 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:30.475 [2024-07-12 10:56:05.659683] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:30.475 COMP_lvs0/lv0 00:29:30.732 10:56:05 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:30.732 10:56:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:30.732 10:56:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:30.732 10:56:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:30.732 10:56:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:30.732 10:56:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:30.732 10:56:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:30.732 10:56:05 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:30.990 [ 00:29:30.990 { 00:29:30.990 "name": "COMP_lvs0/lv0", 00:29:30.990 "aliases": [ 00:29:30.990 "5412552e-17d4-5da3-ab73-1ae9084be4a3" 00:29:30.990 ], 00:29:30.990 "product_name": "compress", 00:29:30.990 "block_size": 512, 00:29:30.990 "num_blocks": 200704, 00:29:30.990 "uuid": "5412552e-17d4-5da3-ab73-1ae9084be4a3", 00:29:30.990 "assigned_rate_limits": { 00:29:30.990 "rw_ios_per_sec": 0, 00:29:30.990 "rw_mbytes_per_sec": 0, 00:29:30.990 "r_mbytes_per_sec": 0, 00:29:30.990 "w_mbytes_per_sec": 0 00:29:30.990 }, 00:29:30.990 "claimed": false, 00:29:30.990 "zoned": false, 00:29:30.990 "supported_io_types": { 00:29:30.990 "read": true, 00:29:30.990 "write": true, 00:29:30.990 "unmap": false, 00:29:30.990 "flush": false, 00:29:30.990 "reset": false, 00:29:30.990 "nvme_admin": false, 00:29:30.990 "nvme_io": false, 00:29:30.990 "nvme_io_md": false, 00:29:30.990 "write_zeroes": true, 00:29:30.990 
"zcopy": false, 00:29:30.990 "get_zone_info": false, 00:29:30.990 "zone_management": false, 00:29:30.990 "zone_append": false, 00:29:30.990 "compare": false, 00:29:30.990 "compare_and_write": false, 00:29:30.990 "abort": false, 00:29:30.990 "seek_hole": false, 00:29:30.990 "seek_data": false, 00:29:30.990 "copy": false, 00:29:30.990 "nvme_iov_md": false 00:29:30.990 }, 00:29:30.990 "driver_specific": { 00:29:30.990 "compress": { 00:29:30.990 "name": "COMP_lvs0/lv0", 00:29:30.990 "base_bdev_name": "3575db14-827d-46b1-a7b9-a6f5da0e94f9" 00:29:30.990 } 00:29:30.990 } 00:29:30.990 } 00:29:30.990 ] 00:29:30.990 10:56:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:30.990 10:56:06 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:31.248 [2024-07-12 10:56:06.278318] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f5dec1b15c0 PMD being used: compress_qat 00:29:31.248 [2024-07-12 10:56:06.280625] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17f9670 PMD being used: compress_qat 00:29:31.248 Running I/O for 3 seconds... 00:29:34.529 00:29:34.529 Latency(us) 00:29:34.529 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:34.529 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:34.529 Verification LBA range: start 0x0 length 0x3100 00:29:34.529 COMP_lvs0/lv0 : 3.00 5152.05 20.13 0.00 0.00 6160.77 480.83 5641.79 00:29:34.529 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:34.529 Verification LBA range: start 0x3100 length 0x3100 00:29:34.529 COMP_lvs0/lv0 : 3.00 5422.36 21.18 0.00 0.00 5866.04 343.71 5584.81 00:29:34.529 =================================================================================================================== 00:29:34.529 Total : 10574.40 41.31 0.00 0.00 6009.64 343.71 5641.79 00:29:34.529 0 00:29:34.529 10:56:09 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:34.529 10:56:09 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:34.529 10:56:09 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:34.787 10:56:09 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:34.787 10:56:09 compress_compdev -- compress/compress.sh@78 -- # killprocess 2181647 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2181647 ']' 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2181647 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2181647 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2181647' 00:29:34.787 killing process with pid 2181647 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@967 -- # kill 2181647 00:29:34.787 Received 
shutdown signal, test time was about 3.000000 seconds 00:29:34.787 00:29:34.787 Latency(us) 00:29:34.787 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:34.787 =================================================================================================================== 00:29:34.787 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:34.787 10:56:09 compress_compdev -- common/autotest_common.sh@972 -- # wait 2181647 00:29:38.071 10:56:12 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:29:38.071 10:56:12 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:38.071 10:56:12 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2183246 00:29:38.071 10:56:12 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:38.071 10:56:12 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:38.071 10:56:12 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2183246 00:29:38.071 10:56:12 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2183246 ']' 00:29:38.071 10:56:12 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:38.071 10:56:12 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:38.071 10:56:12 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:38.071 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:38.071 10:56:12 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:38.071 10:56:12 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:38.071 [2024-07-12 10:56:12.900272] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:29:38.071 [2024-07-12 10:56:12.900343] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2183246 ] 00:29:38.071 [2024-07-12 10:56:13.020914] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:38.071 [2024-07-12 10:56:13.127193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:38.071 [2024-07-12 10:56:13.127200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:39.004 [2024-07-12 10:56:13.895583] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:39.004 10:56:13 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:39.004 10:56:13 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:39.004 10:56:13 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:29:39.004 10:56:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:39.004 10:56:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:39.570 [2024-07-12 10:56:14.539514] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21bb3c0 PMD being used: compress_qat 00:29:39.570 10:56:14 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:39.570 10:56:14 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:39.570 10:56:14 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:39.570 10:56:14 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:39.570 10:56:14 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:39.570 10:56:14 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:39.570 10:56:14 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:39.828 10:56:14 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:40.086 [ 00:29:40.086 { 00:29:40.086 "name": "Nvme0n1", 00:29:40.086 "aliases": [ 00:29:40.086 "01000000-0000-0000-5cd2-e43197705251" 00:29:40.086 ], 00:29:40.086 "product_name": "NVMe disk", 00:29:40.086 "block_size": 512, 00:29:40.086 "num_blocks": 15002931888, 00:29:40.086 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:40.086 "assigned_rate_limits": { 00:29:40.086 "rw_ios_per_sec": 0, 00:29:40.086 "rw_mbytes_per_sec": 0, 00:29:40.086 "r_mbytes_per_sec": 0, 00:29:40.086 "w_mbytes_per_sec": 0 00:29:40.086 }, 00:29:40.086 "claimed": false, 00:29:40.086 "zoned": false, 00:29:40.086 "supported_io_types": { 00:29:40.086 "read": true, 00:29:40.086 "write": true, 00:29:40.086 "unmap": true, 00:29:40.086 "flush": true, 00:29:40.086 "reset": true, 00:29:40.086 "nvme_admin": true, 00:29:40.086 "nvme_io": true, 00:29:40.086 "nvme_io_md": false, 00:29:40.086 "write_zeroes": true, 00:29:40.086 "zcopy": false, 00:29:40.086 "get_zone_info": false, 00:29:40.086 "zone_management": false, 00:29:40.086 "zone_append": false, 00:29:40.086 "compare": false, 00:29:40.086 "compare_and_write": false, 00:29:40.086 "abort": true, 00:29:40.086 "seek_hole": false, 00:29:40.086 "seek_data": false, 00:29:40.086 
"copy": false, 00:29:40.086 "nvme_iov_md": false 00:29:40.086 }, 00:29:40.086 "driver_specific": { 00:29:40.086 "nvme": [ 00:29:40.086 { 00:29:40.086 "pci_address": "0000:5e:00.0", 00:29:40.086 "trid": { 00:29:40.086 "trtype": "PCIe", 00:29:40.086 "traddr": "0000:5e:00.0" 00:29:40.086 }, 00:29:40.086 "ctrlr_data": { 00:29:40.086 "cntlid": 0, 00:29:40.086 "vendor_id": "0x8086", 00:29:40.086 "model_number": "INTEL SSDPF2KX076TZO", 00:29:40.086 "serial_number": "PHAC0301002G7P6CGN", 00:29:40.086 "firmware_revision": "JCV10200", 00:29:40.086 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:40.086 "oacs": { 00:29:40.086 "security": 1, 00:29:40.086 "format": 1, 00:29:40.086 "firmware": 1, 00:29:40.086 "ns_manage": 1 00:29:40.086 }, 00:29:40.086 "multi_ctrlr": false, 00:29:40.086 "ana_reporting": false 00:29:40.086 }, 00:29:40.086 "vs": { 00:29:40.086 "nvme_version": "1.3" 00:29:40.086 }, 00:29:40.086 "ns_data": { 00:29:40.086 "id": 1, 00:29:40.086 "can_share": false 00:29:40.086 }, 00:29:40.086 "security": { 00:29:40.086 "opal": true 00:29:40.086 } 00:29:40.086 } 00:29:40.086 ], 00:29:40.086 "mp_policy": "active_passive" 00:29:40.086 } 00:29:40.086 } 00:29:40.086 ] 00:29:40.086 10:56:15 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:40.086 10:56:15 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:40.345 [2024-07-12 10:56:15.289164] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2020660 PMD being used: compress_qat 00:29:42.875 fe020eb2-2c4f-46d1-b560-9ebe792a57ed 00:29:42.875 10:56:17 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:42.875 85d18cf7-1169-4d59-913e-a78757ade266 00:29:42.875 10:56:17 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:42.875 10:56:17 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:42.875 10:56:17 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:42.875 10:56:17 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:42.875 10:56:17 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:42.875 10:56:17 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:42.875 10:56:17 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:42.875 10:56:17 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:43.133 [ 00:29:43.133 { 00:29:43.133 "name": "85d18cf7-1169-4d59-913e-a78757ade266", 00:29:43.133 "aliases": [ 00:29:43.133 "lvs0/lv0" 00:29:43.133 ], 00:29:43.133 "product_name": "Logical Volume", 00:29:43.133 "block_size": 512, 00:29:43.133 "num_blocks": 204800, 00:29:43.133 "uuid": "85d18cf7-1169-4d59-913e-a78757ade266", 00:29:43.133 "assigned_rate_limits": { 00:29:43.133 "rw_ios_per_sec": 0, 00:29:43.133 "rw_mbytes_per_sec": 0, 00:29:43.133 "r_mbytes_per_sec": 0, 00:29:43.133 "w_mbytes_per_sec": 0 00:29:43.133 }, 00:29:43.133 "claimed": false, 00:29:43.133 "zoned": false, 00:29:43.133 "supported_io_types": { 00:29:43.133 "read": true, 00:29:43.133 "write": true, 00:29:43.133 "unmap": true, 00:29:43.133 "flush": false, 00:29:43.133 "reset": true, 00:29:43.133 
"nvme_admin": false, 00:29:43.133 "nvme_io": false, 00:29:43.133 "nvme_io_md": false, 00:29:43.133 "write_zeroes": true, 00:29:43.133 "zcopy": false, 00:29:43.133 "get_zone_info": false, 00:29:43.133 "zone_management": false, 00:29:43.133 "zone_append": false, 00:29:43.133 "compare": false, 00:29:43.133 "compare_and_write": false, 00:29:43.133 "abort": false, 00:29:43.133 "seek_hole": true, 00:29:43.133 "seek_data": true, 00:29:43.133 "copy": false, 00:29:43.133 "nvme_iov_md": false 00:29:43.133 }, 00:29:43.133 "driver_specific": { 00:29:43.133 "lvol": { 00:29:43.133 "lvol_store_uuid": "fe020eb2-2c4f-46d1-b560-9ebe792a57ed", 00:29:43.133 "base_bdev": "Nvme0n1", 00:29:43.133 "thin_provision": true, 00:29:43.133 "num_allocated_clusters": 0, 00:29:43.133 "snapshot": false, 00:29:43.133 "clone": false, 00:29:43.134 "esnap_clone": false 00:29:43.134 } 00:29:43.134 } 00:29:43.134 } 00:29:43.134 ] 00:29:43.134 10:56:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:43.134 10:56:18 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:29:43.134 10:56:18 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:29:43.391 [2024-07-12 10:56:18.415555] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:43.391 COMP_lvs0/lv0 00:29:43.391 10:56:18 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:43.391 10:56:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:43.391 10:56:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:43.391 10:56:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:43.391 10:56:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:43.391 10:56:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:43.391 10:56:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:43.648 10:56:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:43.906 [ 00:29:43.906 { 00:29:43.906 "name": "COMP_lvs0/lv0", 00:29:43.906 "aliases": [ 00:29:43.906 "c70c65a5-238a-52c9-8f47-7edaa10f39e1" 00:29:43.906 ], 00:29:43.906 "product_name": "compress", 00:29:43.906 "block_size": 512, 00:29:43.906 "num_blocks": 200704, 00:29:43.906 "uuid": "c70c65a5-238a-52c9-8f47-7edaa10f39e1", 00:29:43.906 "assigned_rate_limits": { 00:29:43.906 "rw_ios_per_sec": 0, 00:29:43.906 "rw_mbytes_per_sec": 0, 00:29:43.906 "r_mbytes_per_sec": 0, 00:29:43.906 "w_mbytes_per_sec": 0 00:29:43.906 }, 00:29:43.906 "claimed": false, 00:29:43.906 "zoned": false, 00:29:43.906 "supported_io_types": { 00:29:43.906 "read": true, 00:29:43.906 "write": true, 00:29:43.906 "unmap": false, 00:29:43.906 "flush": false, 00:29:43.906 "reset": false, 00:29:43.906 "nvme_admin": false, 00:29:43.906 "nvme_io": false, 00:29:43.906 "nvme_io_md": false, 00:29:43.906 "write_zeroes": true, 00:29:43.906 "zcopy": false, 00:29:43.906 "get_zone_info": false, 00:29:43.906 "zone_management": false, 00:29:43.906 "zone_append": false, 00:29:43.906 "compare": false, 00:29:43.906 "compare_and_write": false, 00:29:43.906 "abort": false, 00:29:43.906 "seek_hole": false, 00:29:43.906 
"seek_data": false, 00:29:43.906 "copy": false, 00:29:43.906 "nvme_iov_md": false 00:29:43.906 }, 00:29:43.906 "driver_specific": { 00:29:43.906 "compress": { 00:29:43.906 "name": "COMP_lvs0/lv0", 00:29:43.906 "base_bdev_name": "85d18cf7-1169-4d59-913e-a78757ade266" 00:29:43.906 } 00:29:43.906 } 00:29:43.906 } 00:29:43.906 ] 00:29:43.906 10:56:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:43.906 10:56:18 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:43.906 [2024-07-12 10:56:19.033945] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fdb901b15c0 PMD being used: compress_qat 00:29:43.906 [2024-07-12 10:56:19.036149] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21b8770 PMD being used: compress_qat 00:29:43.906 Running I/O for 3 seconds... 00:29:47.187 00:29:47.187 Latency(us) 00:29:47.187 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:47.187 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:47.187 Verification LBA range: start 0x0 length 0x3100 00:29:47.187 COMP_lvs0/lv0 : 3.00 5172.03 20.20 0.00 0.00 6135.30 548.51 5869.75 00:29:47.187 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:47.187 Verification LBA range: start 0x3100 length 0x3100 00:29:47.187 COMP_lvs0/lv0 : 3.00 5445.85 21.27 0.00 0.00 5840.69 395.35 5413.84 00:29:47.187 =================================================================================================================== 00:29:47.187 Total : 10617.88 41.48 0.00 0.00 5984.21 395.35 5869.75 00:29:47.187 0 00:29:47.187 10:56:22 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:47.187 10:56:22 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:47.187 10:56:22 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:47.445 10:56:22 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:47.445 10:56:22 compress_compdev -- compress/compress.sh@78 -- # killprocess 2183246 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2183246 ']' 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2183246 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2183246 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2183246' 00:29:47.445 killing process with pid 2183246 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@967 -- # kill 2183246 00:29:47.445 Received shutdown signal, test time was about 3.000000 seconds 00:29:47.445 00:29:47.445 Latency(us) 00:29:47.445 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:47.445 
=================================================================================================================== 00:29:47.445 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:47.445 10:56:22 compress_compdev -- common/autotest_common.sh@972 -- # wait 2183246 00:29:50.737 10:56:25 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:29:50.737 10:56:25 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:50.737 10:56:25 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2184851 00:29:50.737 10:56:25 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:50.737 10:56:25 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:50.737 10:56:25 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2184851 00:29:50.737 10:56:25 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2184851 ']' 00:29:50.737 10:56:25 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:50.737 10:56:25 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:50.737 10:56:25 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:50.737 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:50.737 10:56:25 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:50.737 10:56:25 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:50.737 [2024-07-12 10:56:25.418365] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
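[Editor's note] The shutdown above closes the first compdev bdevperf pass (512-byte compress blocks); run_bdevperf 32 4096 3 4096 now repeats it with a 4 KiB logical block size. A minimal bash sketch of the sequence the harness drives here, reconstructed only from the rpc.py and bdevperf invocations recorded in this log — the $SPDK shorthand and the pipe from gen_nvme.sh into load_subsystem_config are assumptions for readability, not lines copied verbatim from the trace:

  # launch bdevperf idle (-z) against the DPDK compressdev/QAT config and wait for its RPC socket
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
      -c $SPDK/test/compress/dpdk.json &
  bdevperf_pid=$!

  # attach the NVMe drive, then stack lvstore -> thin lvol -> compress vbdev on top of it
  $SPDK/scripts/gen_nvme.sh | $SPDK/scripts/rpc.py load_subsystem_config   # pipe assumed from the paired @34 trace lines
  $SPDK/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
  $SPDK/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100
  $SPDK/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096

  # drive the verify workload, then tear down in reverse order
  $SPDK/examples/bdev/bdevperf/bdevperf.py perform_tests
  $SPDK/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
  $SPDK/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
  kill $bdevperf_pid    # the harness uses its killprocess helper; plain kill stands in for it here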
00:29:50.737 [2024-07-12 10:56:25.418444] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2184851 ] 00:29:50.737 [2024-07-12 10:56:25.537643] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:50.737 [2024-07-12 10:56:25.641324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:50.737 [2024-07-12 10:56:25.641330] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:51.304 [2024-07-12 10:56:26.386414] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:51.304 10:56:26 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:51.304 10:56:26 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:51.304 10:56:26 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:29:51.304 10:56:26 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:51.304 10:56:26 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:51.870 [2024-07-12 10:56:27.027603] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28b52b0 PMD being used: compress_qat 00:29:51.870 10:56:27 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:51.870 10:56:27 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:51.870 10:56:27 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:51.870 10:56:27 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:51.870 10:56:27 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:51.870 10:56:27 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:51.870 10:56:27 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:52.128 10:56:27 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:52.386 [ 00:29:52.386 { 00:29:52.386 "name": "Nvme0n1", 00:29:52.386 "aliases": [ 00:29:52.386 "01000000-0000-0000-5cd2-e43197705251" 00:29:52.386 ], 00:29:52.386 "product_name": "NVMe disk", 00:29:52.387 "block_size": 512, 00:29:52.387 "num_blocks": 15002931888, 00:29:52.387 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:52.387 "assigned_rate_limits": { 00:29:52.387 "rw_ios_per_sec": 0, 00:29:52.387 "rw_mbytes_per_sec": 0, 00:29:52.387 "r_mbytes_per_sec": 0, 00:29:52.387 "w_mbytes_per_sec": 0 00:29:52.387 }, 00:29:52.387 "claimed": false, 00:29:52.387 "zoned": false, 00:29:52.387 "supported_io_types": { 00:29:52.387 "read": true, 00:29:52.387 "write": true, 00:29:52.387 "unmap": true, 00:29:52.387 "flush": true, 00:29:52.387 "reset": true, 00:29:52.387 "nvme_admin": true, 00:29:52.387 "nvme_io": true, 00:29:52.387 "nvme_io_md": false, 00:29:52.387 "write_zeroes": true, 00:29:52.387 "zcopy": false, 00:29:52.387 "get_zone_info": false, 00:29:52.387 "zone_management": false, 00:29:52.387 "zone_append": false, 00:29:52.387 "compare": false, 00:29:52.387 "compare_and_write": false, 00:29:52.387 "abort": true, 00:29:52.387 "seek_hole": false, 00:29:52.387 "seek_data": false, 00:29:52.387 
"copy": false, 00:29:52.387 "nvme_iov_md": false 00:29:52.387 }, 00:29:52.387 "driver_specific": { 00:29:52.387 "nvme": [ 00:29:52.387 { 00:29:52.387 "pci_address": "0000:5e:00.0", 00:29:52.387 "trid": { 00:29:52.387 "trtype": "PCIe", 00:29:52.387 "traddr": "0000:5e:00.0" 00:29:52.387 }, 00:29:52.387 "ctrlr_data": { 00:29:52.387 "cntlid": 0, 00:29:52.387 "vendor_id": "0x8086", 00:29:52.387 "model_number": "INTEL SSDPF2KX076TZO", 00:29:52.387 "serial_number": "PHAC0301002G7P6CGN", 00:29:52.387 "firmware_revision": "JCV10200", 00:29:52.387 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:52.387 "oacs": { 00:29:52.387 "security": 1, 00:29:52.387 "format": 1, 00:29:52.387 "firmware": 1, 00:29:52.387 "ns_manage": 1 00:29:52.387 }, 00:29:52.387 "multi_ctrlr": false, 00:29:52.387 "ana_reporting": false 00:29:52.387 }, 00:29:52.387 "vs": { 00:29:52.387 "nvme_version": "1.3" 00:29:52.387 }, 00:29:52.387 "ns_data": { 00:29:52.387 "id": 1, 00:29:52.387 "can_share": false 00:29:52.387 }, 00:29:52.387 "security": { 00:29:52.387 "opal": false 00:29:52.387 } 00:29:52.387 } 00:29:52.387 ], 00:29:52.387 "mp_policy": "active_passive" 00:29:52.387 } 00:29:52.387 } 00:29:52.387 ] 00:29:52.387 10:56:27 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:52.387 10:56:27 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:52.683 [2024-07-12 10:56:27.765232] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2703370 PMD being used: compress_qat 00:29:55.229 153d8b79-6b0e-4470-9c50-7eb4767e4fe5 00:29:55.229 10:56:29 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:55.229 ed70af85-eea3-4337-ad9f-9eab8fee6731 00:29:55.229 10:56:30 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:55.229 10:56:30 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:55.229 10:56:30 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:55.229 10:56:30 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:55.229 10:56:30 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:55.229 10:56:30 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:55.229 10:56:30 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:55.488 10:56:30 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:55.746 [ 00:29:55.746 { 00:29:55.746 "name": "ed70af85-eea3-4337-ad9f-9eab8fee6731", 00:29:55.746 "aliases": [ 00:29:55.746 "lvs0/lv0" 00:29:55.746 ], 00:29:55.746 "product_name": "Logical Volume", 00:29:55.746 "block_size": 512, 00:29:55.746 "num_blocks": 204800, 00:29:55.746 "uuid": "ed70af85-eea3-4337-ad9f-9eab8fee6731", 00:29:55.746 "assigned_rate_limits": { 00:29:55.746 "rw_ios_per_sec": 0, 00:29:55.746 "rw_mbytes_per_sec": 0, 00:29:55.746 "r_mbytes_per_sec": 0, 00:29:55.746 "w_mbytes_per_sec": 0 00:29:55.746 }, 00:29:55.746 "claimed": false, 00:29:55.746 "zoned": false, 00:29:55.746 "supported_io_types": { 00:29:55.746 "read": true, 00:29:55.746 "write": true, 00:29:55.746 "unmap": true, 00:29:55.746 "flush": false, 00:29:55.746 "reset": true, 00:29:55.746 
"nvme_admin": false, 00:29:55.746 "nvme_io": false, 00:29:55.746 "nvme_io_md": false, 00:29:55.746 "write_zeroes": true, 00:29:55.746 "zcopy": false, 00:29:55.746 "get_zone_info": false, 00:29:55.746 "zone_management": false, 00:29:55.746 "zone_append": false, 00:29:55.746 "compare": false, 00:29:55.746 "compare_and_write": false, 00:29:55.746 "abort": false, 00:29:55.746 "seek_hole": true, 00:29:55.746 "seek_data": true, 00:29:55.746 "copy": false, 00:29:55.746 "nvme_iov_md": false 00:29:55.746 }, 00:29:55.746 "driver_specific": { 00:29:55.746 "lvol": { 00:29:55.746 "lvol_store_uuid": "153d8b79-6b0e-4470-9c50-7eb4767e4fe5", 00:29:55.746 "base_bdev": "Nvme0n1", 00:29:55.746 "thin_provision": true, 00:29:55.746 "num_allocated_clusters": 0, 00:29:55.746 "snapshot": false, 00:29:55.746 "clone": false, 00:29:55.746 "esnap_clone": false 00:29:55.746 } 00:29:55.746 } 00:29:55.746 } 00:29:55.746 ] 00:29:55.746 10:56:30 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:55.746 10:56:30 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:29:55.747 10:56:30 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:29:56.005 [2024-07-12 10:56:30.963821] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:56.005 COMP_lvs0/lv0 00:29:56.005 10:56:30 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:56.005 10:56:30 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:56.005 10:56:30 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:56.005 10:56:30 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:56.005 10:56:30 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:56.005 10:56:30 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:56.005 10:56:30 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:56.263 10:56:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:56.263 [ 00:29:56.263 { 00:29:56.263 "name": "COMP_lvs0/lv0", 00:29:56.263 "aliases": [ 00:29:56.264 "b3593c69-25a5-5d91-ac30-60d4ef4ec5b5" 00:29:56.264 ], 00:29:56.264 "product_name": "compress", 00:29:56.264 "block_size": 4096, 00:29:56.264 "num_blocks": 25088, 00:29:56.264 "uuid": "b3593c69-25a5-5d91-ac30-60d4ef4ec5b5", 00:29:56.264 "assigned_rate_limits": { 00:29:56.264 "rw_ios_per_sec": 0, 00:29:56.264 "rw_mbytes_per_sec": 0, 00:29:56.264 "r_mbytes_per_sec": 0, 00:29:56.264 "w_mbytes_per_sec": 0 00:29:56.264 }, 00:29:56.264 "claimed": false, 00:29:56.264 "zoned": false, 00:29:56.264 "supported_io_types": { 00:29:56.264 "read": true, 00:29:56.264 "write": true, 00:29:56.264 "unmap": false, 00:29:56.264 "flush": false, 00:29:56.264 "reset": false, 00:29:56.264 "nvme_admin": false, 00:29:56.264 "nvme_io": false, 00:29:56.264 "nvme_io_md": false, 00:29:56.264 "write_zeroes": true, 00:29:56.264 "zcopy": false, 00:29:56.264 "get_zone_info": false, 00:29:56.264 "zone_management": false, 00:29:56.264 "zone_append": false, 00:29:56.264 "compare": false, 00:29:56.264 "compare_and_write": false, 00:29:56.264 "abort": false, 00:29:56.264 "seek_hole": false, 00:29:56.264 
"seek_data": false, 00:29:56.264 "copy": false, 00:29:56.264 "nvme_iov_md": false 00:29:56.264 }, 00:29:56.264 "driver_specific": { 00:29:56.264 "compress": { 00:29:56.264 "name": "COMP_lvs0/lv0", 00:29:56.264 "base_bdev_name": "ed70af85-eea3-4337-ad9f-9eab8fee6731" 00:29:56.264 } 00:29:56.264 } 00:29:56.264 } 00:29:56.264 ] 00:29:56.264 10:56:31 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:56.264 10:56:31 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:56.522 [2024-07-12 10:56:31.530019] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f53281b15c0 PMD being used: compress_qat 00:29:56.522 [2024-07-12 10:56:31.532235] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2795360 PMD being used: compress_qat 00:29:56.522 Running I/O for 3 seconds... 00:29:59.803 00:29:59.803 Latency(us) 00:29:59.803 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:59.803 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:59.803 Verification LBA range: start 0x0 length 0x3100 00:29:59.803 COMP_lvs0/lv0 : 3.00 5134.54 20.06 0.00 0.00 6180.57 398.91 5926.73 00:29:59.803 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:59.803 Verification LBA range: start 0x3100 length 0x3100 00:29:59.803 COMP_lvs0/lv0 : 3.00 5384.83 21.03 0.00 0.00 5906.04 336.58 5613.30 00:29:59.803 =================================================================================================================== 00:29:59.803 Total : 10519.37 41.09 0.00 0.00 6040.05 336.58 5926.73 00:29:59.803 0 00:29:59.803 10:56:34 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:59.803 10:56:34 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:59.803 10:56:34 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:00.060 10:56:35 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:00.060 10:56:35 compress_compdev -- compress/compress.sh@78 -- # killprocess 2184851 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2184851 ']' 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2184851 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2184851 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2184851' 00:30:00.060 killing process with pid 2184851 00:30:00.060 10:56:35 compress_compdev -- common/autotest_common.sh@967 -- # kill 2184851 00:30:00.060 Received shutdown signal, test time was about 3.000000 seconds 00:30:00.060 00:30:00.061 Latency(us) 00:30:00.061 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:00.061 
=================================================================================================================== 00:30:00.061 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:00.061 10:56:35 compress_compdev -- common/autotest_common.sh@972 -- # wait 2184851 00:30:03.344 10:56:38 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:30:03.344 10:56:38 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:03.344 10:56:38 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2186460 00:30:03.344 10:56:38 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:03.344 10:56:38 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:30:03.344 10:56:38 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2186460 00:30:03.344 10:56:38 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2186460 ']' 00:30:03.344 10:56:38 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:03.344 10:56:38 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:03.344 10:56:38 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:03.344 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:03.344 10:56:38 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:03.344 10:56:38 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:03.344 [2024-07-12 10:56:38.172664] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
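[Editor's note] What follows is the run_bdevio pass: the same lvol + compress stack is rebuilt, but I/O is driven by the CUnit-based bdevio tool instead of bdevperf, and bdev_compress_create is called without -l (the harness's block-size argument is empty here, and the resulting COMP bdev reports a 512-byte block size further down). A short sketch of just the pieces that differ, again based only on commands visible in this trace ($SPDK as above):

  # bdevio in wait-for-tests mode (-w) against the same compressdev config
  $SPDK/test/bdev/bdevio/bdevio -c $SPDK/test/compress/dpdk.json -w &
  bdevio_pid=$!

  # create_vols with no block-size argument: -l is simply omitted
  $SPDK/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem

  # run the bdevio suite ("blockdev write read block", "blockdev reset", ...) over COMP_lvs0/lv0
  $SPDK/test/bdev/bdevio/tests.py perform_tests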
00:30:03.344 [2024-07-12 10:56:38.172736] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2186460 ] 00:30:03.344 [2024-07-12 10:56:38.302007] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:03.344 [2024-07-12 10:56:38.408722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:03.344 [2024-07-12 10:56:38.408809] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:03.344 [2024-07-12 10:56:38.408813] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:04.278 [2024-07-12 10:56:39.150603] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:04.278 10:56:39 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:04.278 10:56:39 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:04.278 10:56:39 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:30:04.278 10:56:39 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:04.278 10:56:39 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:04.845 [2024-07-12 10:56:39.780187] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2901f20 PMD being used: compress_qat 00:30:04.845 10:56:39 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:04.845 10:56:39 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:04.845 10:56:39 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:04.845 10:56:39 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:04.845 10:56:39 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:04.845 10:56:39 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:04.845 10:56:39 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:05.103 10:56:40 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:05.103 [ 00:30:05.103 { 00:30:05.103 "name": "Nvme0n1", 00:30:05.103 "aliases": [ 00:30:05.103 "01000000-0000-0000-5cd2-e43197705251" 00:30:05.103 ], 00:30:05.103 "product_name": "NVMe disk", 00:30:05.103 "block_size": 512, 00:30:05.103 "num_blocks": 15002931888, 00:30:05.103 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:05.103 "assigned_rate_limits": { 00:30:05.103 "rw_ios_per_sec": 0, 00:30:05.103 "rw_mbytes_per_sec": 0, 00:30:05.103 "r_mbytes_per_sec": 0, 00:30:05.103 "w_mbytes_per_sec": 0 00:30:05.103 }, 00:30:05.103 "claimed": false, 00:30:05.103 "zoned": false, 00:30:05.103 "supported_io_types": { 00:30:05.103 "read": true, 00:30:05.103 "write": true, 00:30:05.103 "unmap": true, 00:30:05.103 "flush": true, 00:30:05.103 "reset": true, 00:30:05.103 "nvme_admin": true, 00:30:05.103 "nvme_io": true, 00:30:05.103 "nvme_io_md": false, 00:30:05.103 "write_zeroes": true, 00:30:05.103 "zcopy": false, 00:30:05.103 "get_zone_info": false, 00:30:05.103 "zone_management": false, 00:30:05.103 "zone_append": false, 00:30:05.103 "compare": false, 00:30:05.103 "compare_and_write": false, 
00:30:05.103 "abort": true, 00:30:05.103 "seek_hole": false, 00:30:05.103 "seek_data": false, 00:30:05.103 "copy": false, 00:30:05.103 "nvme_iov_md": false 00:30:05.103 }, 00:30:05.103 "driver_specific": { 00:30:05.103 "nvme": [ 00:30:05.103 { 00:30:05.103 "pci_address": "0000:5e:00.0", 00:30:05.103 "trid": { 00:30:05.103 "trtype": "PCIe", 00:30:05.103 "traddr": "0000:5e:00.0" 00:30:05.103 }, 00:30:05.103 "ctrlr_data": { 00:30:05.103 "cntlid": 0, 00:30:05.103 "vendor_id": "0x8086", 00:30:05.103 "model_number": "INTEL SSDPF2KX076TZO", 00:30:05.103 "serial_number": "PHAC0301002G7P6CGN", 00:30:05.103 "firmware_revision": "JCV10200", 00:30:05.103 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:05.103 "oacs": { 00:30:05.103 "security": 1, 00:30:05.103 "format": 1, 00:30:05.103 "firmware": 1, 00:30:05.103 "ns_manage": 1 00:30:05.103 }, 00:30:05.103 "multi_ctrlr": false, 00:30:05.103 "ana_reporting": false 00:30:05.103 }, 00:30:05.103 "vs": { 00:30:05.103 "nvme_version": "1.3" 00:30:05.103 }, 00:30:05.103 "ns_data": { 00:30:05.103 "id": 1, 00:30:05.103 "can_share": false 00:30:05.103 }, 00:30:05.103 "security": { 00:30:05.103 "opal": true 00:30:05.103 } 00:30:05.103 } 00:30:05.103 ], 00:30:05.103 "mp_policy": "active_passive" 00:30:05.103 } 00:30:05.103 } 00:30:05.103 ] 00:30:05.362 10:56:40 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:05.362 10:56:40 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:05.362 [2024-07-12 10:56:40.537800] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2750440 PMD being used: compress_qat 00:30:07.889 aed78b17-0c04-48bb-9daa-328ce8d119a4 00:30:07.889 10:56:42 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:07.889 d94e99ba-af7d-4a9c-a2df-cd1ceb53dc27 00:30:07.889 10:56:43 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:07.889 10:56:43 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:07.889 10:56:43 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:07.889 10:56:43 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:07.889 10:56:43 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:07.889 10:56:43 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:07.889 10:56:43 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:08.146 10:56:43 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:08.403 [ 00:30:08.404 { 00:30:08.404 "name": "d94e99ba-af7d-4a9c-a2df-cd1ceb53dc27", 00:30:08.404 "aliases": [ 00:30:08.404 "lvs0/lv0" 00:30:08.404 ], 00:30:08.404 "product_name": "Logical Volume", 00:30:08.404 "block_size": 512, 00:30:08.404 "num_blocks": 204800, 00:30:08.404 "uuid": "d94e99ba-af7d-4a9c-a2df-cd1ceb53dc27", 00:30:08.404 "assigned_rate_limits": { 00:30:08.404 "rw_ios_per_sec": 0, 00:30:08.404 "rw_mbytes_per_sec": 0, 00:30:08.404 "r_mbytes_per_sec": 0, 00:30:08.404 "w_mbytes_per_sec": 0 00:30:08.404 }, 00:30:08.404 "claimed": false, 00:30:08.404 "zoned": false, 00:30:08.404 "supported_io_types": { 00:30:08.404 "read": true, 00:30:08.404 
"write": true, 00:30:08.404 "unmap": true, 00:30:08.404 "flush": false, 00:30:08.404 "reset": true, 00:30:08.404 "nvme_admin": false, 00:30:08.404 "nvme_io": false, 00:30:08.404 "nvme_io_md": false, 00:30:08.404 "write_zeroes": true, 00:30:08.404 "zcopy": false, 00:30:08.404 "get_zone_info": false, 00:30:08.404 "zone_management": false, 00:30:08.404 "zone_append": false, 00:30:08.404 "compare": false, 00:30:08.404 "compare_and_write": false, 00:30:08.404 "abort": false, 00:30:08.404 "seek_hole": true, 00:30:08.404 "seek_data": true, 00:30:08.404 "copy": false, 00:30:08.404 "nvme_iov_md": false 00:30:08.404 }, 00:30:08.404 "driver_specific": { 00:30:08.404 "lvol": { 00:30:08.404 "lvol_store_uuid": "aed78b17-0c04-48bb-9daa-328ce8d119a4", 00:30:08.404 "base_bdev": "Nvme0n1", 00:30:08.404 "thin_provision": true, 00:30:08.404 "num_allocated_clusters": 0, 00:30:08.404 "snapshot": false, 00:30:08.404 "clone": false, 00:30:08.404 "esnap_clone": false 00:30:08.404 } 00:30:08.404 } 00:30:08.404 } 00:30:08.404 ] 00:30:08.404 10:56:43 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:08.404 10:56:43 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:08.404 10:56:43 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:08.661 [2024-07-12 10:56:43.737513] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:08.661 COMP_lvs0/lv0 00:30:08.661 10:56:43 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:08.662 10:56:43 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:08.662 10:56:43 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:08.662 10:56:43 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:08.662 10:56:43 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:08.662 10:56:43 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:08.662 10:56:43 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:08.919 10:56:44 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:09.177 [ 00:30:09.177 { 00:30:09.177 "name": "COMP_lvs0/lv0", 00:30:09.177 "aliases": [ 00:30:09.177 "aed65084-e129-53db-a27b-ce424e77e944" 00:30:09.177 ], 00:30:09.177 "product_name": "compress", 00:30:09.177 "block_size": 512, 00:30:09.177 "num_blocks": 200704, 00:30:09.177 "uuid": "aed65084-e129-53db-a27b-ce424e77e944", 00:30:09.177 "assigned_rate_limits": { 00:30:09.177 "rw_ios_per_sec": 0, 00:30:09.177 "rw_mbytes_per_sec": 0, 00:30:09.177 "r_mbytes_per_sec": 0, 00:30:09.177 "w_mbytes_per_sec": 0 00:30:09.177 }, 00:30:09.177 "claimed": false, 00:30:09.177 "zoned": false, 00:30:09.177 "supported_io_types": { 00:30:09.177 "read": true, 00:30:09.177 "write": true, 00:30:09.177 "unmap": false, 00:30:09.177 "flush": false, 00:30:09.177 "reset": false, 00:30:09.177 "nvme_admin": false, 00:30:09.177 "nvme_io": false, 00:30:09.177 "nvme_io_md": false, 00:30:09.177 "write_zeroes": true, 00:30:09.177 "zcopy": false, 00:30:09.177 "get_zone_info": false, 00:30:09.177 "zone_management": false, 00:30:09.177 "zone_append": false, 00:30:09.177 "compare": false, 00:30:09.177 
"compare_and_write": false, 00:30:09.177 "abort": false, 00:30:09.177 "seek_hole": false, 00:30:09.177 "seek_data": false, 00:30:09.177 "copy": false, 00:30:09.177 "nvme_iov_md": false 00:30:09.177 }, 00:30:09.177 "driver_specific": { 00:30:09.177 "compress": { 00:30:09.177 "name": "COMP_lvs0/lv0", 00:30:09.177 "base_bdev_name": "d94e99ba-af7d-4a9c-a2df-cd1ceb53dc27" 00:30:09.177 } 00:30:09.177 } 00:30:09.177 } 00:30:09.177 ] 00:30:09.177 10:56:44 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:09.177 10:56:44 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:09.177 [2024-07-12 10:56:44.358357] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc78c1b1350 PMD being used: compress_qat 00:30:09.177 I/O targets: 00:30:09.177 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:09.177 00:30:09.177 00:30:09.178 CUnit - A unit testing framework for C - Version 2.1-3 00:30:09.178 http://cunit.sourceforge.net/ 00:30:09.178 00:30:09.178 00:30:09.178 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:09.178 Test: blockdev write read block ...passed 00:30:09.178 Test: blockdev write zeroes read block ...passed 00:30:09.178 Test: blockdev write zeroes read no split ...passed 00:30:09.435 Test: blockdev write zeroes read split ...passed 00:30:09.436 Test: blockdev write zeroes read split partial ...passed 00:30:09.436 Test: blockdev reset ...[2024-07-12 10:56:44.395912] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:09.436 passed 00:30:09.436 Test: blockdev write read 8 blocks ...passed 00:30:09.436 Test: blockdev write read size > 128k ...passed 00:30:09.436 Test: blockdev write read invalid size ...passed 00:30:09.436 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:09.436 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:09.436 Test: blockdev write read max offset ...passed 00:30:09.436 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:09.436 Test: blockdev writev readv 8 blocks ...passed 00:30:09.436 Test: blockdev writev readv 30 x 1block ...passed 00:30:09.436 Test: blockdev writev readv block ...passed 00:30:09.436 Test: blockdev writev readv size > 128k ...passed 00:30:09.436 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:09.436 Test: blockdev comparev and writev ...passed 00:30:09.436 Test: blockdev nvme passthru rw ...passed 00:30:09.436 Test: blockdev nvme passthru vendor specific ...passed 00:30:09.436 Test: blockdev nvme admin passthru ...passed 00:30:09.436 Test: blockdev copy ...passed 00:30:09.436 00:30:09.436 Run Summary: Type Total Ran Passed Failed Inactive 00:30:09.436 suites 1 1 n/a 0 0 00:30:09.436 tests 23 23 23 0 0 00:30:09.436 asserts 130 130 130 0 n/a 00:30:09.436 00:30:09.436 Elapsed time = 0.091 seconds 00:30:09.436 0 00:30:09.436 10:56:44 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:30:09.436 10:56:44 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:09.694 10:56:44 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:09.951 10:56:44 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:09.952 10:56:44 compress_compdev -- compress/compress.sh@62 -- # killprocess 
2186460 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2186460 ']' 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2186460 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2186460 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2186460' 00:30:09.952 killing process with pid 2186460 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@967 -- # kill 2186460 00:30:09.952 10:56:44 compress_compdev -- common/autotest_common.sh@972 -- # wait 2186460 00:30:13.235 10:56:47 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:30:13.235 10:56:47 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:13.235 00:30:13.235 real 0m48.032s 00:30:13.235 user 1m50.860s 00:30:13.235 sys 0m5.868s 00:30:13.235 10:56:48 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:13.235 10:56:48 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:13.235 ************************************ 00:30:13.235 END TEST compress_compdev 00:30:13.235 ************************************ 00:30:13.235 10:56:48 -- common/autotest_common.sh@1142 -- # return 0 00:30:13.235 10:56:48 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:13.235 10:56:48 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:13.235 10:56:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:13.235 10:56:48 -- common/autotest_common.sh@10 -- # set +x 00:30:13.235 ************************************ 00:30:13.235 START TEST compress_isal 00:30:13.235 ************************************ 00:30:13.235 10:56:48 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:13.235 * Looking for test storage... 
00:30:13.235 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:13.235 10:56:48 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:13.235 10:56:48 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:13.235 10:56:48 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:13.235 10:56:48 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:13.235 10:56:48 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:13.235 10:56:48 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:13.235 10:56:48 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:13.235 10:56:48 compress_isal -- paths/export.sh@5 -- # export PATH 00:30:13.235 10:56:48 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:13.235 10:56:48 compress_isal -- nvmf/common.sh@47 -- # : 0 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:13.236 10:56:48 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2187916 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:13.236 10:56:48 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2187916 00:30:13.236 10:56:48 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2187916 ']' 00:30:13.236 10:56:48 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:13.236 10:56:48 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:13.236 10:56:48 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:13.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
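[Editor's note] From this point the sequence is repeated as compress_isal: the software ISA-L compression path is exercised instead of the DPDK compressdev/QAT driver, which in the trace shows up as bdevperf being launched without the -c dpdk.json argument (and without the compress_qat PMD notices seen earlier). A minimal sketch of the changed launch; everything else follows the compdev flow sketched above ($SPDK as before, shown only as an illustration):

  # compress_isal: same verify workload, no compressdev JSON config
  $SPDK/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
  bdevperf_pid=$!
  $SPDK/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem   # ISA-L-backed compress bdev, default block size
  $SPDK/examples/bdev/bdevperf/bdevperf.py perform_tests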
00:30:13.236 10:56:48 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:13.236 10:56:48 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:13.236 [2024-07-12 10:56:48.286955] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:30:13.236 [2024-07-12 10:56:48.287028] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2187916 ] 00:30:13.236 [2024-07-12 10:56:48.405396] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:13.494 [2024-07-12 10:56:48.509786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:13.494 [2024-07-12 10:56:48.509794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:14.059 10:56:49 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:14.059 10:56:49 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:14.059 10:56:49 compress_isal -- compress/compress.sh@74 -- # create_vols 00:30:14.060 10:56:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:14.060 10:56:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:14.625 10:56:49 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:14.625 10:56:49 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:14.625 10:56:49 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:14.625 10:56:49 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:14.625 10:56:49 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:14.625 10:56:49 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:14.625 10:56:49 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:15.192 10:56:50 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:15.192 [ 00:30:15.192 { 00:30:15.192 "name": "Nvme0n1", 00:30:15.192 "aliases": [ 00:30:15.192 "01000000-0000-0000-5cd2-e43197705251" 00:30:15.192 ], 00:30:15.192 "product_name": "NVMe disk", 00:30:15.192 "block_size": 512, 00:30:15.192 "num_blocks": 15002931888, 00:30:15.192 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:15.192 "assigned_rate_limits": { 00:30:15.192 "rw_ios_per_sec": 0, 00:30:15.192 "rw_mbytes_per_sec": 0, 00:30:15.192 "r_mbytes_per_sec": 0, 00:30:15.192 "w_mbytes_per_sec": 0 00:30:15.192 }, 00:30:15.192 "claimed": false, 00:30:15.192 "zoned": false, 00:30:15.192 "supported_io_types": { 00:30:15.192 "read": true, 00:30:15.192 "write": true, 00:30:15.192 "unmap": true, 00:30:15.192 "flush": true, 00:30:15.192 "reset": true, 00:30:15.192 "nvme_admin": true, 00:30:15.192 "nvme_io": true, 00:30:15.192 "nvme_io_md": false, 00:30:15.192 "write_zeroes": true, 00:30:15.192 "zcopy": false, 00:30:15.192 "get_zone_info": false, 00:30:15.192 "zone_management": false, 00:30:15.192 "zone_append": false, 00:30:15.192 "compare": false, 00:30:15.192 "compare_and_write": false, 00:30:15.192 "abort": true, 00:30:15.192 "seek_hole": false, 00:30:15.192 "seek_data": false, 00:30:15.192 "copy": false, 00:30:15.192 
"nvme_iov_md": false 00:30:15.192 }, 00:30:15.192 "driver_specific": { 00:30:15.192 "nvme": [ 00:30:15.192 { 00:30:15.192 "pci_address": "0000:5e:00.0", 00:30:15.192 "trid": { 00:30:15.192 "trtype": "PCIe", 00:30:15.192 "traddr": "0000:5e:00.0" 00:30:15.192 }, 00:30:15.192 "ctrlr_data": { 00:30:15.192 "cntlid": 0, 00:30:15.192 "vendor_id": "0x8086", 00:30:15.192 "model_number": "INTEL SSDPF2KX076TZO", 00:30:15.192 "serial_number": "PHAC0301002G7P6CGN", 00:30:15.192 "firmware_revision": "JCV10200", 00:30:15.192 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:15.192 "oacs": { 00:30:15.192 "security": 1, 00:30:15.192 "format": 1, 00:30:15.192 "firmware": 1, 00:30:15.192 "ns_manage": 1 00:30:15.192 }, 00:30:15.192 "multi_ctrlr": false, 00:30:15.192 "ana_reporting": false 00:30:15.192 }, 00:30:15.192 "vs": { 00:30:15.192 "nvme_version": "1.3" 00:30:15.192 }, 00:30:15.192 "ns_data": { 00:30:15.192 "id": 1, 00:30:15.192 "can_share": false 00:30:15.192 }, 00:30:15.192 "security": { 00:30:15.192 "opal": true 00:30:15.192 } 00:30:15.192 } 00:30:15.192 ], 00:30:15.192 "mp_policy": "active_passive" 00:30:15.192 } 00:30:15.192 } 00:30:15.192 ] 00:30:15.192 10:56:50 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:15.192 10:56:50 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:17.776 c4720875-4619-461c-ae17-aeccd57a0c9a 00:30:17.776 10:56:52 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:18.034 e63bba85-30f9-4420-be1b-5d1ae35c35e3 00:30:18.034 10:56:53 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:18.034 10:56:53 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:18.034 10:56:53 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:18.034 10:56:53 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:18.034 10:56:53 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:18.034 10:56:53 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:18.034 10:56:53 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:18.291 10:56:53 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:18.291 [ 00:30:18.291 { 00:30:18.291 "name": "e63bba85-30f9-4420-be1b-5d1ae35c35e3", 00:30:18.291 "aliases": [ 00:30:18.291 "lvs0/lv0" 00:30:18.291 ], 00:30:18.291 "product_name": "Logical Volume", 00:30:18.291 "block_size": 512, 00:30:18.291 "num_blocks": 204800, 00:30:18.291 "uuid": "e63bba85-30f9-4420-be1b-5d1ae35c35e3", 00:30:18.291 "assigned_rate_limits": { 00:30:18.291 "rw_ios_per_sec": 0, 00:30:18.291 "rw_mbytes_per_sec": 0, 00:30:18.291 "r_mbytes_per_sec": 0, 00:30:18.291 "w_mbytes_per_sec": 0 00:30:18.291 }, 00:30:18.291 "claimed": false, 00:30:18.291 "zoned": false, 00:30:18.291 "supported_io_types": { 00:30:18.291 "read": true, 00:30:18.291 "write": true, 00:30:18.291 "unmap": true, 00:30:18.291 "flush": false, 00:30:18.291 "reset": true, 00:30:18.291 "nvme_admin": false, 00:30:18.291 "nvme_io": false, 00:30:18.291 "nvme_io_md": false, 00:30:18.291 "write_zeroes": true, 00:30:18.291 "zcopy": false, 00:30:18.291 "get_zone_info": false, 00:30:18.291 
"zone_management": false, 00:30:18.291 "zone_append": false, 00:30:18.291 "compare": false, 00:30:18.291 "compare_and_write": false, 00:30:18.291 "abort": false, 00:30:18.291 "seek_hole": true, 00:30:18.291 "seek_data": true, 00:30:18.291 "copy": false, 00:30:18.291 "nvme_iov_md": false 00:30:18.291 }, 00:30:18.291 "driver_specific": { 00:30:18.291 "lvol": { 00:30:18.291 "lvol_store_uuid": "c4720875-4619-461c-ae17-aeccd57a0c9a", 00:30:18.291 "base_bdev": "Nvme0n1", 00:30:18.291 "thin_provision": true, 00:30:18.291 "num_allocated_clusters": 0, 00:30:18.291 "snapshot": false, 00:30:18.291 "clone": false, 00:30:18.291 "esnap_clone": false 00:30:18.291 } 00:30:18.291 } 00:30:18.291 } 00:30:18.291 ] 00:30:18.549 10:56:53 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:18.549 10:56:53 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:18.549 10:56:53 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:18.549 [2024-07-12 10:56:53.727635] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:18.549 COMP_lvs0/lv0 00:30:18.549 10:56:53 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:18.549 10:56:53 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:18.549 10:56:53 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:18.549 10:56:53 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:18.806 10:56:53 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:18.806 10:56:53 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:18.806 10:56:53 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:18.806 10:56:53 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:19.064 [ 00:30:19.064 { 00:30:19.065 "name": "COMP_lvs0/lv0", 00:30:19.065 "aliases": [ 00:30:19.065 "3b7798db-0eb9-57e5-9d8a-9acd48ff15af" 00:30:19.065 ], 00:30:19.065 "product_name": "compress", 00:30:19.065 "block_size": 512, 00:30:19.065 "num_blocks": 200704, 00:30:19.065 "uuid": "3b7798db-0eb9-57e5-9d8a-9acd48ff15af", 00:30:19.065 "assigned_rate_limits": { 00:30:19.065 "rw_ios_per_sec": 0, 00:30:19.065 "rw_mbytes_per_sec": 0, 00:30:19.065 "r_mbytes_per_sec": 0, 00:30:19.065 "w_mbytes_per_sec": 0 00:30:19.065 }, 00:30:19.065 "claimed": false, 00:30:19.065 "zoned": false, 00:30:19.065 "supported_io_types": { 00:30:19.065 "read": true, 00:30:19.065 "write": true, 00:30:19.065 "unmap": false, 00:30:19.065 "flush": false, 00:30:19.065 "reset": false, 00:30:19.065 "nvme_admin": false, 00:30:19.065 "nvme_io": false, 00:30:19.065 "nvme_io_md": false, 00:30:19.065 "write_zeroes": true, 00:30:19.065 "zcopy": false, 00:30:19.065 "get_zone_info": false, 00:30:19.065 "zone_management": false, 00:30:19.065 "zone_append": false, 00:30:19.065 "compare": false, 00:30:19.065 "compare_and_write": false, 00:30:19.065 "abort": false, 00:30:19.065 "seek_hole": false, 00:30:19.065 "seek_data": false, 00:30:19.065 "copy": false, 00:30:19.065 "nvme_iov_md": false 00:30:19.065 }, 00:30:19.065 "driver_specific": { 00:30:19.065 "compress": { 00:30:19.065 "name": "COMP_lvs0/lv0", 00:30:19.065 "base_bdev_name": 
"e63bba85-30f9-4420-be1b-5d1ae35c35e3" 00:30:19.065 } 00:30:19.065 } 00:30:19.065 } 00:30:19.065 ] 00:30:19.065 10:56:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:19.065 10:56:54 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:19.324 Running I/O for 3 seconds... 00:30:22.608 00:30:22.608 Latency(us) 00:30:22.608 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:22.608 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:22.608 Verification LBA range: start 0x0 length 0x3100 00:30:22.608 COMP_lvs0/lv0 : 3.00 3911.62 15.28 0.00 0.00 8126.74 605.50 7921.31 00:30:22.608 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:22.608 Verification LBA range: start 0x3100 length 0x3100 00:30:22.608 COMP_lvs0/lv0 : 3.00 3914.22 15.29 0.00 0.00 8133.99 537.82 7807.33 00:30:22.608 =================================================================================================================== 00:30:22.608 Total : 7825.84 30.57 0.00 0.00 8130.37 537.82 7921.31 00:30:22.608 0 00:30:22.608 10:56:57 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:22.608 10:56:57 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:22.608 10:56:57 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:22.866 10:56:57 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:22.866 10:56:57 compress_isal -- compress/compress.sh@78 -- # killprocess 2187916 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2187916 ']' 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2187916 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2187916 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2187916' 00:30:22.866 killing process with pid 2187916 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@967 -- # kill 2187916 00:30:22.866 Received shutdown signal, test time was about 3.000000 seconds 00:30:22.866 00:30:22.866 Latency(us) 00:30:22.866 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:22.866 =================================================================================================================== 00:30:22.866 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:22.866 10:56:57 compress_isal -- common/autotest_common.sh@972 -- # wait 2187916 00:30:26.145 10:57:00 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:26.145 10:57:00 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:26.145 10:57:00 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2189567 00:30:26.145 10:57:00 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; 
error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:26.145 10:57:00 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:26.146 10:57:00 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2189567 00:30:26.146 10:57:00 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2189567 ']' 00:30:26.146 10:57:00 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:26.146 10:57:00 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:26.146 10:57:00 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:26.146 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:26.146 10:57:00 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:26.146 10:57:00 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:26.146 [2024-07-12 10:57:00.939569] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:30:26.146 [2024-07-12 10:57:00.939637] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2189567 ] 00:30:26.146 [2024-07-12 10:57:01.060111] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:26.146 [2024-07-12 10:57:01.166356] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:26.146 [2024-07-12 10:57:01.166361] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:26.711 10:57:01 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:26.711 10:57:01 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:26.711 10:57:01 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:30:26.711 10:57:01 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:26.711 10:57:01 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:27.645 10:57:02 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:27.645 10:57:02 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:27.645 10:57:02 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:27.645 10:57:02 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:27.645 10:57:02 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:27.645 10:57:02 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:27.645 10:57:02 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:27.645 10:57:02 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:27.908 [ 00:30:27.908 { 00:30:27.908 "name": "Nvme0n1", 00:30:27.908 "aliases": [ 00:30:27.908 "01000000-0000-0000-5cd2-e43197705251" 00:30:27.908 ], 00:30:27.908 "product_name": "NVMe disk", 00:30:27.908 "block_size": 512, 00:30:27.908 "num_blocks": 15002931888, 00:30:27.908 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:27.908 
"assigned_rate_limits": { 00:30:27.908 "rw_ios_per_sec": 0, 00:30:27.908 "rw_mbytes_per_sec": 0, 00:30:27.908 "r_mbytes_per_sec": 0, 00:30:27.908 "w_mbytes_per_sec": 0 00:30:27.908 }, 00:30:27.908 "claimed": false, 00:30:27.908 "zoned": false, 00:30:27.908 "supported_io_types": { 00:30:27.908 "read": true, 00:30:27.908 "write": true, 00:30:27.908 "unmap": true, 00:30:27.908 "flush": true, 00:30:27.908 "reset": true, 00:30:27.908 "nvme_admin": true, 00:30:27.908 "nvme_io": true, 00:30:27.908 "nvme_io_md": false, 00:30:27.908 "write_zeroes": true, 00:30:27.908 "zcopy": false, 00:30:27.908 "get_zone_info": false, 00:30:27.908 "zone_management": false, 00:30:27.908 "zone_append": false, 00:30:27.908 "compare": false, 00:30:27.908 "compare_and_write": false, 00:30:27.908 "abort": true, 00:30:27.908 "seek_hole": false, 00:30:27.908 "seek_data": false, 00:30:27.908 "copy": false, 00:30:27.908 "nvme_iov_md": false 00:30:27.908 }, 00:30:27.908 "driver_specific": { 00:30:27.908 "nvme": [ 00:30:27.908 { 00:30:27.908 "pci_address": "0000:5e:00.0", 00:30:27.908 "trid": { 00:30:27.908 "trtype": "PCIe", 00:30:27.908 "traddr": "0000:5e:00.0" 00:30:27.908 }, 00:30:27.908 "ctrlr_data": { 00:30:27.908 "cntlid": 0, 00:30:27.908 "vendor_id": "0x8086", 00:30:27.908 "model_number": "INTEL SSDPF2KX076TZO", 00:30:27.908 "serial_number": "PHAC0301002G7P6CGN", 00:30:27.908 "firmware_revision": "JCV10200", 00:30:27.908 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:27.908 "oacs": { 00:30:27.908 "security": 1, 00:30:27.908 "format": 1, 00:30:27.908 "firmware": 1, 00:30:27.908 "ns_manage": 1 00:30:27.908 }, 00:30:27.908 "multi_ctrlr": false, 00:30:27.908 "ana_reporting": false 00:30:27.908 }, 00:30:27.908 "vs": { 00:30:27.908 "nvme_version": "1.3" 00:30:27.908 }, 00:30:27.908 "ns_data": { 00:30:27.908 "id": 1, 00:30:27.908 "can_share": false 00:30:27.908 }, 00:30:27.908 "security": { 00:30:27.908 "opal": true 00:30:27.908 } 00:30:27.908 } 00:30:27.908 ], 00:30:27.908 "mp_policy": "active_passive" 00:30:27.908 } 00:30:27.908 } 00:30:27.908 ] 00:30:27.908 10:57:02 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:27.908 10:57:02 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:30.438 6bf4e6ec-7356-4791-837e-3c9be6a00c8f 00:30:30.438 10:57:05 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:30.438 50d82cfc-c114-416f-99fc-a0259c13d521 00:30:30.438 10:57:05 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:30.438 10:57:05 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:30.438 10:57:05 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:30.438 10:57:05 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:30.438 10:57:05 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:30.438 10:57:05 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:30.438 10:57:05 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:31.004 10:57:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:31.262 [ 00:30:31.262 { 00:30:31.262 "name": 
"50d82cfc-c114-416f-99fc-a0259c13d521", 00:30:31.262 "aliases": [ 00:30:31.262 "lvs0/lv0" 00:30:31.262 ], 00:30:31.262 "product_name": "Logical Volume", 00:30:31.262 "block_size": 512, 00:30:31.262 "num_blocks": 204800, 00:30:31.262 "uuid": "50d82cfc-c114-416f-99fc-a0259c13d521", 00:30:31.262 "assigned_rate_limits": { 00:30:31.262 "rw_ios_per_sec": 0, 00:30:31.262 "rw_mbytes_per_sec": 0, 00:30:31.262 "r_mbytes_per_sec": 0, 00:30:31.262 "w_mbytes_per_sec": 0 00:30:31.262 }, 00:30:31.262 "claimed": false, 00:30:31.262 "zoned": false, 00:30:31.262 "supported_io_types": { 00:30:31.262 "read": true, 00:30:31.262 "write": true, 00:30:31.262 "unmap": true, 00:30:31.262 "flush": false, 00:30:31.262 "reset": true, 00:30:31.262 "nvme_admin": false, 00:30:31.262 "nvme_io": false, 00:30:31.262 "nvme_io_md": false, 00:30:31.262 "write_zeroes": true, 00:30:31.262 "zcopy": false, 00:30:31.262 "get_zone_info": false, 00:30:31.262 "zone_management": false, 00:30:31.262 "zone_append": false, 00:30:31.262 "compare": false, 00:30:31.262 "compare_and_write": false, 00:30:31.262 "abort": false, 00:30:31.262 "seek_hole": true, 00:30:31.262 "seek_data": true, 00:30:31.262 "copy": false, 00:30:31.262 "nvme_iov_md": false 00:30:31.262 }, 00:30:31.262 "driver_specific": { 00:30:31.262 "lvol": { 00:30:31.262 "lvol_store_uuid": "6bf4e6ec-7356-4791-837e-3c9be6a00c8f", 00:30:31.262 "base_bdev": "Nvme0n1", 00:30:31.262 "thin_provision": true, 00:30:31.262 "num_allocated_clusters": 0, 00:30:31.262 "snapshot": false, 00:30:31.262 "clone": false, 00:30:31.262 "esnap_clone": false 00:30:31.262 } 00:30:31.262 } 00:30:31.262 } 00:30:31.262 ] 00:30:31.262 10:57:06 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:31.262 10:57:06 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:31.262 10:57:06 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:31.521 [2024-07-12 10:57:06.579824] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:31.521 COMP_lvs0/lv0 00:30:31.521 10:57:06 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:31.521 10:57:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:31.521 10:57:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:31.521 10:57:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:31.521 10:57:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:31.521 10:57:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:31.521 10:57:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:31.779 10:57:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:32.037 [ 00:30:32.037 { 00:30:32.037 "name": "COMP_lvs0/lv0", 00:30:32.037 "aliases": [ 00:30:32.037 "d9611aba-feb8-50b7-a9f3-8e39d1f58d14" 00:30:32.037 ], 00:30:32.037 "product_name": "compress", 00:30:32.037 "block_size": 512, 00:30:32.037 "num_blocks": 200704, 00:30:32.037 "uuid": "d9611aba-feb8-50b7-a9f3-8e39d1f58d14", 00:30:32.037 "assigned_rate_limits": { 00:30:32.037 "rw_ios_per_sec": 0, 00:30:32.037 "rw_mbytes_per_sec": 0, 00:30:32.037 "r_mbytes_per_sec": 0, 00:30:32.037 
"w_mbytes_per_sec": 0 00:30:32.037 }, 00:30:32.037 "claimed": false, 00:30:32.037 "zoned": false, 00:30:32.037 "supported_io_types": { 00:30:32.037 "read": true, 00:30:32.037 "write": true, 00:30:32.037 "unmap": false, 00:30:32.037 "flush": false, 00:30:32.037 "reset": false, 00:30:32.037 "nvme_admin": false, 00:30:32.037 "nvme_io": false, 00:30:32.037 "nvme_io_md": false, 00:30:32.037 "write_zeroes": true, 00:30:32.037 "zcopy": false, 00:30:32.037 "get_zone_info": false, 00:30:32.037 "zone_management": false, 00:30:32.037 "zone_append": false, 00:30:32.037 "compare": false, 00:30:32.037 "compare_and_write": false, 00:30:32.037 "abort": false, 00:30:32.037 "seek_hole": false, 00:30:32.037 "seek_data": false, 00:30:32.037 "copy": false, 00:30:32.037 "nvme_iov_md": false 00:30:32.037 }, 00:30:32.037 "driver_specific": { 00:30:32.037 "compress": { 00:30:32.037 "name": "COMP_lvs0/lv0", 00:30:32.037 "base_bdev_name": "50d82cfc-c114-416f-99fc-a0259c13d521" 00:30:32.037 } 00:30:32.037 } 00:30:32.037 } 00:30:32.037 ] 00:30:32.037 10:57:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:32.037 10:57:07 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:32.037 Running I/O for 3 seconds... 00:30:35.321 00:30:35.321 Latency(us) 00:30:35.321 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:35.321 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:35.321 Verification LBA range: start 0x0 length 0x3100 00:30:35.321 COMP_lvs0/lv0 : 3.00 3901.50 15.24 0.00 0.00 8146.91 694.54 9459.98 00:30:35.321 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:35.321 Verification LBA range: start 0x3100 length 0x3100 00:30:35.321 COMP_lvs0/lv0 : 3.00 3906.76 15.26 0.00 0.00 8148.93 662.48 9402.99 00:30:35.321 =================================================================================================================== 00:30:35.321 Total : 7808.26 30.50 0.00 0.00 8147.92 662.48 9459.98 00:30:35.321 0 00:30:35.321 10:57:10 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:35.321 10:57:10 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:35.321 10:57:10 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:35.580 10:57:10 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:35.580 10:57:10 compress_isal -- compress/compress.sh@78 -- # killprocess 2189567 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2189567 ']' 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2189567 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2189567 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2189567' 00:30:35.580 killing process with pid 2189567 00:30:35.580 
10:57:10 compress_isal -- common/autotest_common.sh@967 -- # kill 2189567 00:30:35.580 Received shutdown signal, test time was about 3.000000 seconds 00:30:35.580 00:30:35.580 Latency(us) 00:30:35.580 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:35.580 =================================================================================================================== 00:30:35.580 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:35.580 10:57:10 compress_isal -- common/autotest_common.sh@972 -- # wait 2189567 00:30:38.859 10:57:13 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:38.859 10:57:13 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:38.859 10:57:13 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2191626 00:30:38.859 10:57:13 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:38.859 10:57:13 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:38.859 10:57:13 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2191626 00:30:38.859 10:57:13 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2191626 ']' 00:30:38.859 10:57:13 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:38.859 10:57:13 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:38.859 10:57:13 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:38.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:38.859 10:57:13 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:38.859 10:57:13 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:38.859 [2024-07-12 10:57:13.738684] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
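The trace above tears down the 512-byte-chunk volumes and immediately relaunches bdevperf for the 4 KiB-chunk pass. A minimal sketch of that teardown-and-relaunch sequence, using only paths and flags that appear in the trace (the $SPDK shorthand and the explicit kill/wait of $bdevperf_pid stand in for the job's destroy_vols and killprocess helpers):

  # destroy_vols: drop the compress vbdev, then the lvstore underneath it
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
  $SPDK/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
  kill "$bdevperf_pid" && wait "$bdevperf_pid"    # reap the previous bdevperf

  # relaunch bdevperf in RPC-driven mode (-z): 32 outstanding I/Os, 4 KiB I/O size,
  # verify workload, 3 seconds, reactors on cores 1-2 (mask 0x6)
  $SPDK/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
  bdevperf_pid=$!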
00:30:38.859 [2024-07-12 10:57:13.738742] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2191626 ] 00:30:38.859 [2024-07-12 10:57:13.838459] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:38.859 [2024-07-12 10:57:13.942485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:38.859 [2024-07-12 10:57:13.942491] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:39.822 10:57:14 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:39.822 10:57:14 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:39.822 10:57:14 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:30:39.822 10:57:14 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:39.822 10:57:14 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:40.399 10:57:15 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:40.399 10:57:15 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:40.399 10:57:15 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:40.399 10:57:15 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:40.399 10:57:15 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:40.399 10:57:15 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:40.399 10:57:15 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:40.399 10:57:15 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:40.658 [ 00:30:40.658 { 00:30:40.658 "name": "Nvme0n1", 00:30:40.658 "aliases": [ 00:30:40.658 "01000000-0000-0000-5cd2-e43197705251" 00:30:40.658 ], 00:30:40.658 "product_name": "NVMe disk", 00:30:40.658 "block_size": 512, 00:30:40.658 "num_blocks": 15002931888, 00:30:40.658 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:40.658 "assigned_rate_limits": { 00:30:40.658 "rw_ios_per_sec": 0, 00:30:40.658 "rw_mbytes_per_sec": 0, 00:30:40.658 "r_mbytes_per_sec": 0, 00:30:40.658 "w_mbytes_per_sec": 0 00:30:40.658 }, 00:30:40.658 "claimed": false, 00:30:40.658 "zoned": false, 00:30:40.658 "supported_io_types": { 00:30:40.658 "read": true, 00:30:40.658 "write": true, 00:30:40.658 "unmap": true, 00:30:40.658 "flush": true, 00:30:40.658 "reset": true, 00:30:40.658 "nvme_admin": true, 00:30:40.658 "nvme_io": true, 00:30:40.658 "nvme_io_md": false, 00:30:40.658 "write_zeroes": true, 00:30:40.658 "zcopy": false, 00:30:40.658 "get_zone_info": false, 00:30:40.658 "zone_management": false, 00:30:40.658 "zone_append": false, 00:30:40.658 "compare": false, 00:30:40.658 "compare_and_write": false, 00:30:40.658 "abort": true, 00:30:40.658 "seek_hole": false, 00:30:40.658 "seek_data": false, 00:30:40.658 "copy": false, 00:30:40.658 "nvme_iov_md": false 00:30:40.658 }, 00:30:40.658 "driver_specific": { 00:30:40.658 "nvme": [ 00:30:40.658 { 00:30:40.658 "pci_address": "0000:5e:00.0", 00:30:40.658 "trid": { 00:30:40.658 "trtype": "PCIe", 00:30:40.658 "traddr": "0000:5e:00.0" 00:30:40.658 }, 00:30:40.658 
"ctrlr_data": { 00:30:40.658 "cntlid": 0, 00:30:40.658 "vendor_id": "0x8086", 00:30:40.658 "model_number": "INTEL SSDPF2KX076TZO", 00:30:40.658 "serial_number": "PHAC0301002G7P6CGN", 00:30:40.658 "firmware_revision": "JCV10200", 00:30:40.658 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:40.658 "oacs": { 00:30:40.658 "security": 1, 00:30:40.658 "format": 1, 00:30:40.658 "firmware": 1, 00:30:40.658 "ns_manage": 1 00:30:40.658 }, 00:30:40.658 "multi_ctrlr": false, 00:30:40.658 "ana_reporting": false 00:30:40.658 }, 00:30:40.658 "vs": { 00:30:40.658 "nvme_version": "1.3" 00:30:40.658 }, 00:30:40.658 "ns_data": { 00:30:40.658 "id": 1, 00:30:40.658 "can_share": false 00:30:40.658 }, 00:30:40.658 "security": { 00:30:40.658 "opal": true 00:30:40.658 } 00:30:40.658 } 00:30:40.658 ], 00:30:40.658 "mp_policy": "active_passive" 00:30:40.658 } 00:30:40.658 } 00:30:40.658 ] 00:30:40.658 10:57:15 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:40.658 10:57:15 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:43.189 f4f5f348-1823-4e50-9deb-06f227ad21d1 00:30:43.189 10:57:18 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:43.447 9a6706e8-6a1d-454c-8da7-6865bf7ada23 00:30:43.447 10:57:18 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:43.447 10:57:18 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:43.447 10:57:18 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:43.447 10:57:18 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:43.447 10:57:18 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:43.447 10:57:18 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:43.447 10:57:18 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:43.705 10:57:18 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:43.962 [ 00:30:43.962 { 00:30:43.962 "name": "9a6706e8-6a1d-454c-8da7-6865bf7ada23", 00:30:43.962 "aliases": [ 00:30:43.962 "lvs0/lv0" 00:30:43.962 ], 00:30:43.962 "product_name": "Logical Volume", 00:30:43.962 "block_size": 512, 00:30:43.962 "num_blocks": 204800, 00:30:43.962 "uuid": "9a6706e8-6a1d-454c-8da7-6865bf7ada23", 00:30:43.962 "assigned_rate_limits": { 00:30:43.962 "rw_ios_per_sec": 0, 00:30:43.962 "rw_mbytes_per_sec": 0, 00:30:43.962 "r_mbytes_per_sec": 0, 00:30:43.962 "w_mbytes_per_sec": 0 00:30:43.962 }, 00:30:43.962 "claimed": false, 00:30:43.963 "zoned": false, 00:30:43.963 "supported_io_types": { 00:30:43.963 "read": true, 00:30:43.963 "write": true, 00:30:43.963 "unmap": true, 00:30:43.963 "flush": false, 00:30:43.963 "reset": true, 00:30:43.963 "nvme_admin": false, 00:30:43.963 "nvme_io": false, 00:30:43.963 "nvme_io_md": false, 00:30:43.963 "write_zeroes": true, 00:30:43.963 "zcopy": false, 00:30:43.963 "get_zone_info": false, 00:30:43.963 "zone_management": false, 00:30:43.963 "zone_append": false, 00:30:43.963 "compare": false, 00:30:43.963 "compare_and_write": false, 00:30:43.963 "abort": false, 00:30:43.963 "seek_hole": true, 00:30:43.963 "seek_data": true, 00:30:43.963 "copy": false, 00:30:43.963 
"nvme_iov_md": false 00:30:43.963 }, 00:30:43.963 "driver_specific": { 00:30:43.963 "lvol": { 00:30:43.963 "lvol_store_uuid": "f4f5f348-1823-4e50-9deb-06f227ad21d1", 00:30:43.963 "base_bdev": "Nvme0n1", 00:30:43.963 "thin_provision": true, 00:30:43.963 "num_allocated_clusters": 0, 00:30:43.963 "snapshot": false, 00:30:43.963 "clone": false, 00:30:43.963 "esnap_clone": false 00:30:43.963 } 00:30:43.963 } 00:30:43.963 } 00:30:43.963 ] 00:30:43.963 10:57:18 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:43.963 10:57:18 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:43.963 10:57:18 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:44.221 [2024-07-12 10:57:19.215500] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:44.221 COMP_lvs0/lv0 00:30:44.221 10:57:19 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:44.221 10:57:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:44.221 10:57:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:44.221 10:57:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:44.221 10:57:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:44.221 10:57:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:44.221 10:57:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:44.478 10:57:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:44.736 [ 00:30:44.736 { 00:30:44.736 "name": "COMP_lvs0/lv0", 00:30:44.736 "aliases": [ 00:30:44.736 "209e49e5-574c-5378-9999-f700eb1955d6" 00:30:44.737 ], 00:30:44.737 "product_name": "compress", 00:30:44.737 "block_size": 4096, 00:30:44.737 "num_blocks": 25088, 00:30:44.737 "uuid": "209e49e5-574c-5378-9999-f700eb1955d6", 00:30:44.737 "assigned_rate_limits": { 00:30:44.737 "rw_ios_per_sec": 0, 00:30:44.737 "rw_mbytes_per_sec": 0, 00:30:44.737 "r_mbytes_per_sec": 0, 00:30:44.737 "w_mbytes_per_sec": 0 00:30:44.737 }, 00:30:44.737 "claimed": false, 00:30:44.737 "zoned": false, 00:30:44.737 "supported_io_types": { 00:30:44.737 "read": true, 00:30:44.737 "write": true, 00:30:44.737 "unmap": false, 00:30:44.737 "flush": false, 00:30:44.737 "reset": false, 00:30:44.737 "nvme_admin": false, 00:30:44.737 "nvme_io": false, 00:30:44.737 "nvme_io_md": false, 00:30:44.737 "write_zeroes": true, 00:30:44.737 "zcopy": false, 00:30:44.737 "get_zone_info": false, 00:30:44.737 "zone_management": false, 00:30:44.737 "zone_append": false, 00:30:44.737 "compare": false, 00:30:44.737 "compare_and_write": false, 00:30:44.737 "abort": false, 00:30:44.737 "seek_hole": false, 00:30:44.737 "seek_data": false, 00:30:44.737 "copy": false, 00:30:44.737 "nvme_iov_md": false 00:30:44.737 }, 00:30:44.737 "driver_specific": { 00:30:44.737 "compress": { 00:30:44.737 "name": "COMP_lvs0/lv0", 00:30:44.737 "base_bdev_name": "9a6706e8-6a1d-454c-8da7-6865bf7ada23" 00:30:44.737 } 00:30:44.737 } 00:30:44.737 } 00:30:44.737 ] 00:30:44.737 10:57:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:44.737 10:57:19 compress_isal -- compress/compress.sh@75 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:44.737 Running I/O for 3 seconds... 00:30:48.026 00:30:48.026 Latency(us) 00:30:48.026 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:48.026 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:48.026 Verification LBA range: start 0x0 length 0x3100 00:30:48.026 COMP_lvs0/lv0 : 3.00 3952.56 15.44 0.00 0.00 8042.70 662.48 8890.10 00:30:48.026 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:48.026 Verification LBA range: start 0x3100 length 0x3100 00:30:48.026 COMP_lvs0/lv0 : 3.00 3956.53 15.46 0.00 0.00 8047.71 552.07 8947.09 00:30:48.026 =================================================================================================================== 00:30:48.026 Total : 7909.09 30.89 0.00 0.00 8045.21 552.07 8947.09 00:30:48.026 0 00:30:48.026 10:57:22 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:48.026 10:57:22 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:48.026 10:57:23 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:48.287 10:57:23 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:48.287 10:57:23 compress_isal -- compress/compress.sh@78 -- # killprocess 2191626 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2191626 ']' 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2191626 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2191626 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2191626' 00:30:48.287 killing process with pid 2191626 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@967 -- # kill 2191626 00:30:48.287 Received shutdown signal, test time was about 3.000000 seconds 00:30:48.287 00:30:48.287 Latency(us) 00:30:48.287 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:48.287 =================================================================================================================== 00:30:48.287 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:48.287 10:57:23 compress_isal -- common/autotest_common.sh@972 -- # wait 2191626 00:30:51.568 10:57:26 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:30:51.568 10:57:26 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:51.568 10:57:26 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2193221 00:30:51.568 10:57:26 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:51.568 10:57:26 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:30:51.568 10:57:26 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2193221 
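The create_vols flow that precedes each bdevperf pass (re-run above with a 4 KiB chunk size) reduces to a handful of RPCs. A sketch with the paths from this job; piping gen_nvme.sh into load_subsystem_config is inferred from the adjacent trace lines rather than shown explicitly there:

  # create_vols <lb_size>: attach the NVMe controller, carve a 100 MiB thin lvol
  # out of Nvme0n1, then wrap it in a compress vbdev backed by /tmp/pmem
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/scripts/gen_nvme.sh | $SPDK/scripts/rpc.py load_subsystem_config
  $SPDK/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
  $SPDK/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100
  $SPDK/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096
  $SPDK/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000   # poll until the bdev registers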
00:30:51.568 10:57:26 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2193221 ']' 00:30:51.568 10:57:26 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:51.568 10:57:26 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:51.568 10:57:26 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:51.568 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:51.568 10:57:26 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:51.568 10:57:26 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:51.568 [2024-07-12 10:57:26.464149] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:30:51.568 [2024-07-12 10:57:26.464226] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2193221 ] 00:30:51.568 [2024-07-12 10:57:26.596911] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:51.568 [2024-07-12 10:57:26.706874] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:51.568 [2024-07-12 10:57:26.706959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:51.568 [2024-07-12 10:57:26.706964] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:52.501 10:57:27 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:52.501 10:57:27 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:52.501 10:57:27 compress_isal -- compress/compress.sh@58 -- # create_vols 00:30:52.501 10:57:27 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:52.501 10:57:27 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:53.068 10:57:27 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:53.068 10:57:27 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:53.068 10:57:27 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:53.068 10:57:27 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:53.068 10:57:27 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:53.068 10:57:27 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:53.068 10:57:27 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:53.068 10:57:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:53.326 [ 00:30:53.326 { 00:30:53.326 "name": "Nvme0n1", 00:30:53.326 "aliases": [ 00:30:53.326 "01000000-0000-0000-5cd2-e43197705251" 00:30:53.326 ], 00:30:53.326 "product_name": "NVMe disk", 00:30:53.326 "block_size": 512, 00:30:53.326 "num_blocks": 15002931888, 00:30:53.326 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:53.326 "assigned_rate_limits": { 00:30:53.326 "rw_ios_per_sec": 0, 00:30:53.326 "rw_mbytes_per_sec": 0, 00:30:53.326 "r_mbytes_per_sec": 0, 00:30:53.326 "w_mbytes_per_sec": 0 00:30:53.326 }, 00:30:53.326 "claimed": false, 00:30:53.326 
"zoned": false, 00:30:53.326 "supported_io_types": { 00:30:53.326 "read": true, 00:30:53.326 "write": true, 00:30:53.326 "unmap": true, 00:30:53.326 "flush": true, 00:30:53.326 "reset": true, 00:30:53.326 "nvme_admin": true, 00:30:53.326 "nvme_io": true, 00:30:53.326 "nvme_io_md": false, 00:30:53.326 "write_zeroes": true, 00:30:53.326 "zcopy": false, 00:30:53.326 "get_zone_info": false, 00:30:53.326 "zone_management": false, 00:30:53.326 "zone_append": false, 00:30:53.326 "compare": false, 00:30:53.326 "compare_and_write": false, 00:30:53.326 "abort": true, 00:30:53.326 "seek_hole": false, 00:30:53.326 "seek_data": false, 00:30:53.326 "copy": false, 00:30:53.326 "nvme_iov_md": false 00:30:53.326 }, 00:30:53.326 "driver_specific": { 00:30:53.326 "nvme": [ 00:30:53.326 { 00:30:53.326 "pci_address": "0000:5e:00.0", 00:30:53.326 "trid": { 00:30:53.326 "trtype": "PCIe", 00:30:53.326 "traddr": "0000:5e:00.0" 00:30:53.326 }, 00:30:53.326 "ctrlr_data": { 00:30:53.326 "cntlid": 0, 00:30:53.326 "vendor_id": "0x8086", 00:30:53.326 "model_number": "INTEL SSDPF2KX076TZO", 00:30:53.326 "serial_number": "PHAC0301002G7P6CGN", 00:30:53.326 "firmware_revision": "JCV10200", 00:30:53.326 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:53.326 "oacs": { 00:30:53.326 "security": 1, 00:30:53.326 "format": 1, 00:30:53.326 "firmware": 1, 00:30:53.326 "ns_manage": 1 00:30:53.326 }, 00:30:53.326 "multi_ctrlr": false, 00:30:53.326 "ana_reporting": false 00:30:53.326 }, 00:30:53.326 "vs": { 00:30:53.326 "nvme_version": "1.3" 00:30:53.326 }, 00:30:53.326 "ns_data": { 00:30:53.326 "id": 1, 00:30:53.326 "can_share": false 00:30:53.326 }, 00:30:53.326 "security": { 00:30:53.326 "opal": true 00:30:53.326 } 00:30:53.326 } 00:30:53.326 ], 00:30:53.326 "mp_policy": "active_passive" 00:30:53.326 } 00:30:53.326 } 00:30:53.326 ] 00:30:53.326 10:57:28 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:53.326 10:57:28 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:55.857 3fdc1872-3a9d-45d8-b61f-e18872924068 00:30:55.857 10:57:30 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:56.115 5ff26369-cd69-4813-ba94-99dc199195a6 00:30:56.115 10:57:31 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:56.115 10:57:31 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:56.115 10:57:31 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:56.115 10:57:31 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:56.115 10:57:31 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:56.115 10:57:31 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:56.115 10:57:31 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:56.373 10:57:31 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:56.630 [ 00:30:56.630 { 00:30:56.630 "name": "5ff26369-cd69-4813-ba94-99dc199195a6", 00:30:56.630 "aliases": [ 00:30:56.630 "lvs0/lv0" 00:30:56.630 ], 00:30:56.630 "product_name": "Logical Volume", 00:30:56.630 "block_size": 512, 00:30:56.630 "num_blocks": 204800, 00:30:56.630 "uuid": 
"5ff26369-cd69-4813-ba94-99dc199195a6", 00:30:56.630 "assigned_rate_limits": { 00:30:56.630 "rw_ios_per_sec": 0, 00:30:56.630 "rw_mbytes_per_sec": 0, 00:30:56.630 "r_mbytes_per_sec": 0, 00:30:56.630 "w_mbytes_per_sec": 0 00:30:56.630 }, 00:30:56.630 "claimed": false, 00:30:56.630 "zoned": false, 00:30:56.630 "supported_io_types": { 00:30:56.630 "read": true, 00:30:56.630 "write": true, 00:30:56.630 "unmap": true, 00:30:56.630 "flush": false, 00:30:56.630 "reset": true, 00:30:56.630 "nvme_admin": false, 00:30:56.630 "nvme_io": false, 00:30:56.630 "nvme_io_md": false, 00:30:56.630 "write_zeroes": true, 00:30:56.630 "zcopy": false, 00:30:56.630 "get_zone_info": false, 00:30:56.630 "zone_management": false, 00:30:56.630 "zone_append": false, 00:30:56.630 "compare": false, 00:30:56.630 "compare_and_write": false, 00:30:56.630 "abort": false, 00:30:56.630 "seek_hole": true, 00:30:56.630 "seek_data": true, 00:30:56.630 "copy": false, 00:30:56.630 "nvme_iov_md": false 00:30:56.630 }, 00:30:56.630 "driver_specific": { 00:30:56.630 "lvol": { 00:30:56.630 "lvol_store_uuid": "3fdc1872-3a9d-45d8-b61f-e18872924068", 00:30:56.630 "base_bdev": "Nvme0n1", 00:30:56.630 "thin_provision": true, 00:30:56.630 "num_allocated_clusters": 0, 00:30:56.630 "snapshot": false, 00:30:56.630 "clone": false, 00:30:56.630 "esnap_clone": false 00:30:56.630 } 00:30:56.630 } 00:30:56.630 } 00:30:56.630 ] 00:30:56.630 10:57:31 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:56.630 10:57:31 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:56.630 10:57:31 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:56.887 [2024-07-12 10:57:31.909543] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:56.887 COMP_lvs0/lv0 00:30:56.887 10:57:31 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:56.887 10:57:31 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:56.887 10:57:31 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:56.887 10:57:31 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:56.887 10:57:31 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:56.887 10:57:31 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:56.887 10:57:31 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:57.143 10:57:32 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:57.402 [ 00:30:57.402 { 00:30:57.402 "name": "COMP_lvs0/lv0", 00:30:57.402 "aliases": [ 00:30:57.402 "258522b3-0f02-533a-9fd5-5a16c9ebdddf" 00:30:57.402 ], 00:30:57.402 "product_name": "compress", 00:30:57.402 "block_size": 512, 00:30:57.402 "num_blocks": 200704, 00:30:57.402 "uuid": "258522b3-0f02-533a-9fd5-5a16c9ebdddf", 00:30:57.402 "assigned_rate_limits": { 00:30:57.402 "rw_ios_per_sec": 0, 00:30:57.402 "rw_mbytes_per_sec": 0, 00:30:57.402 "r_mbytes_per_sec": 0, 00:30:57.402 "w_mbytes_per_sec": 0 00:30:57.402 }, 00:30:57.402 "claimed": false, 00:30:57.402 "zoned": false, 00:30:57.402 "supported_io_types": { 00:30:57.402 "read": true, 00:30:57.402 "write": true, 00:30:57.402 "unmap": false, 00:30:57.402 "flush": false, 00:30:57.402 
"reset": false, 00:30:57.402 "nvme_admin": false, 00:30:57.402 "nvme_io": false, 00:30:57.402 "nvme_io_md": false, 00:30:57.402 "write_zeroes": true, 00:30:57.402 "zcopy": false, 00:30:57.402 "get_zone_info": false, 00:30:57.402 "zone_management": false, 00:30:57.402 "zone_append": false, 00:30:57.402 "compare": false, 00:30:57.402 "compare_and_write": false, 00:30:57.402 "abort": false, 00:30:57.402 "seek_hole": false, 00:30:57.402 "seek_data": false, 00:30:57.402 "copy": false, 00:30:57.402 "nvme_iov_md": false 00:30:57.402 }, 00:30:57.402 "driver_specific": { 00:30:57.402 "compress": { 00:30:57.402 "name": "COMP_lvs0/lv0", 00:30:57.402 "base_bdev_name": "5ff26369-cd69-4813-ba94-99dc199195a6" 00:30:57.402 } 00:30:57.402 } 00:30:57.402 } 00:30:57.402 ] 00:30:57.402 10:57:32 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:57.402 10:57:32 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:57.402 I/O targets: 00:30:57.402 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:57.402 00:30:57.402 00:30:57.402 CUnit - A unit testing framework for C - Version 2.1-3 00:30:57.402 http://cunit.sourceforge.net/ 00:30:57.402 00:30:57.402 00:30:57.402 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:57.402 Test: blockdev write read block ...passed 00:30:57.402 Test: blockdev write zeroes read block ...passed 00:30:57.402 Test: blockdev write zeroes read no split ...passed 00:30:57.402 Test: blockdev write zeroes read split ...passed 00:30:57.402 Test: blockdev write zeroes read split partial ...passed 00:30:57.402 Test: blockdev reset ...[2024-07-12 10:57:32.577107] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:57.402 passed 00:30:57.402 Test: blockdev write read 8 blocks ...passed 00:30:57.402 Test: blockdev write read size > 128k ...passed 00:30:57.402 Test: blockdev write read invalid size ...passed 00:30:57.402 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:57.402 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:57.402 Test: blockdev write read max offset ...passed 00:30:57.402 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:57.402 Test: blockdev writev readv 8 blocks ...passed 00:30:57.402 Test: blockdev writev readv 30 x 1block ...passed 00:30:57.402 Test: blockdev writev readv block ...passed 00:30:57.402 Test: blockdev writev readv size > 128k ...passed 00:30:57.402 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:57.402 Test: blockdev comparev and writev ...passed 00:30:57.402 Test: blockdev nvme passthru rw ...passed 00:30:57.402 Test: blockdev nvme passthru vendor specific ...passed 00:30:57.402 Test: blockdev nvme admin passthru ...passed 00:30:57.402 Test: blockdev copy ...passed 00:30:57.402 00:30:57.402 Run Summary: Type Total Ran Passed Failed Inactive 00:30:57.402 suites 1 1 n/a 0 0 00:30:57.402 tests 23 23 23 0 0 00:30:57.402 asserts 130 130 130 0 n/a 00:30:57.402 00:30:57.402 Elapsed time = 0.109 seconds 00:30:57.402 0 00:30:57.660 10:57:32 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:30:57.660 10:57:32 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:57.660 10:57:32 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:30:57.918 10:57:33 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:57.918 10:57:33 compress_isal -- compress/compress.sh@62 -- # killprocess 2193221 00:30:57.918 10:57:33 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2193221 ']' 00:30:57.918 10:57:33 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2193221 00:30:57.918 10:57:33 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:57.918 10:57:33 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:57.918 10:57:33 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2193221 00:30:58.175 10:57:33 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:58.175 10:57:33 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:58.175 10:57:33 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2193221' 00:30:58.175 killing process with pid 2193221 00:30:58.175 10:57:33 compress_isal -- common/autotest_common.sh@967 -- # kill 2193221 00:30:58.175 10:57:33 compress_isal -- common/autotest_common.sh@972 -- # wait 2193221 00:31:01.457 10:57:35 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:01.457 10:57:35 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:01.457 00:31:01.457 real 0m47.823s 00:31:01.457 user 1m52.068s 00:31:01.457 sys 0m4.237s 00:31:01.457 10:57:35 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:01.457 10:57:35 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:01.457 ************************************ 00:31:01.457 END TEST compress_isal 00:31:01.457 ************************************ 00:31:01.457 10:57:35 -- common/autotest_common.sh@1142 -- # return 0 00:31:01.457 10:57:35 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:31:01.457 10:57:35 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:31:01.457 10:57:35 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:01.457 10:57:35 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:01.457 10:57:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:01.457 10:57:35 -- common/autotest_common.sh@10 -- # set +x 00:31:01.457 ************************************ 00:31:01.457 START TEST blockdev_crypto_aesni 00:31:01.457 ************************************ 00:31:01.457 10:57:35 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:01.457 * Looking for test storage... 
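The compress_isal suite above is closed out by the killprocess helper, and the same xtrace pattern appears after every bdevperf/bdevio pass in this log. A simplified sketch of what that trace shows; the real helper in autotest_common.sh has extra branches (for example when the target runs under sudo, which the 'reactor_0 = sudo' comparison above is guarding) that are omitted here:

  # killprocess <pid>, reconstructed from the trace (simplified)
  killprocess() {
      local pid=$1
      kill -0 "$pid" || return 0                       # already gone, nothing to do
      local name
      name=$(ps --no-headers -o comm= "$pid")          # e.g. reactor_0
      echo "killing process with pid $pid"
      kill "$pid"                                      # SIGTERM; the sudo branch is omitted
      wait "$pid"                                      # reap it and propagate its exit status
  }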
00:31:01.457 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2194525 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2194525 00:31:01.457 10:57:36 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 2194525 ']' 00:31:01.457 10:57:36 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:01.457 10:57:36 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:01.457 10:57:36 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:01.457 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
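start_spdk_tgt then blocks in waitforlisten until the target's RPC socket answers; the trace shows the pid, the socket path, and the max_retries=100 bound, but not the probe itself. A hedged sketch of such a wait loop follows; using rpc_get_methods as the readiness probe is an assumption, not something visible in this log:

  # waitforlisten <pid> [rpc_addr]: poll the RPC socket until the target answers
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  waitforlisten() {
      local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} i
      echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
      for ((i = 0; i < 100; i++)); do                  # max_retries=100, as in the trace
          kill -0 "$pid" || return 1                   # the target died while starting
          # assumed probe: any cheap RPC succeeding means the socket is live
          "$SPDK/scripts/rpc.py" -t 1 -s "$rpc_addr" rpc_get_methods &>/dev/null && return 0
          sleep 0.5
      done
      return 1
  }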
00:31:01.457 10:57:36 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:01.457 10:57:36 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:01.457 10:57:36 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:01.457 [2024-07-12 10:57:36.178336] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:31:01.458 [2024-07-12 10:57:36.178406] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2194525 ] 00:31:01.458 [2024-07-12 10:57:36.308742] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:01.458 [2024-07-12 10:57:36.412552] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:02.025 10:57:37 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:02.025 10:57:37 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:31:02.025 10:57:37 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:02.025 10:57:37 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:31:02.025 10:57:37 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:31:02.025 10:57:37 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:02.025 10:57:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:02.025 [2024-07-12 10:57:37.026538] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:02.026 [2024-07-12 10:57:37.034571] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:02.026 [2024-07-12 10:57:37.042588] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:02.026 [2024-07-12 10:57:37.114515] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:04.589 true 00:31:04.589 true 00:31:04.589 true 00:31:04.589 true 00:31:04.589 Malloc0 00:31:04.589 Malloc1 00:31:04.589 Malloc2 00:31:04.589 Malloc3 00:31:04.589 [2024-07-12 10:57:39.500363] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:04.589 crypto_ram 00:31:04.589 [2024-07-12 10:57:39.508381] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:04.589 crypto_ram2 00:31:04.589 [2024-07-12 10:57:39.516402] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:04.589 crypto_ram3 00:31:04.589 [2024-07-12 10:57:39.524424] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:04.589 crypto_ram4 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:31:04.589 10:57:39 
blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9be9f94c-ece4-5bed-bbd4-8f24e4fbb91c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9be9f94c-ece4-5bed-bbd4-8f24e4fbb91c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d33ea4c5-a3bc-524b-9a7f-5da4206c0870"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d33ea4c5-a3bc-524b-9a7f-5da4206c0870",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' 
"claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ee70e39d-9ac7-5843-99b7-b4d98ef14387"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ee70e39d-9ac7-5843-99b7-b4d98ef14387",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "7a1b0f70-0eea-52de-a31c-c4ae6d196919"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7a1b0f70-0eea-52de-a31c-c4ae6d196919",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 
00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:04.589 10:57:39 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 2194525 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 2194525 ']' 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 2194525 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2194525 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2194525' 00:31:04.589 killing process with pid 2194525 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 2194525 00:31:04.589 10:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 2194525 00:31:05.156 10:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:05.156 10:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:05.156 10:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:05.156 10:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:05.156 10:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:05.156 ************************************ 00:31:05.156 START TEST bdev_hello_world 00:31:05.156 ************************************ 00:31:05.156 10:57:40 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:05.415 [2024-07-12 10:57:40.386757] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:31:05.415 [2024-07-12 10:57:40.386820] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2195069 ] 00:31:05.415 [2024-07-12 10:57:40.516061] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:05.674 [2024-07-12 10:57:40.617497] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:05.674 [2024-07-12 10:57:40.638806] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:05.674 [2024-07-12 10:57:40.646831] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:05.674 [2024-07-12 10:57:40.654858] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:05.674 [2024-07-12 10:57:40.764891] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:08.205 [2024-07-12 10:57:42.986216] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:08.205 [2024-07-12 10:57:42.986282] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:08.205 [2024-07-12 10:57:42.986297] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:08.205 [2024-07-12 10:57:42.994235] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:08.205 [2024-07-12 10:57:42.994255] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:08.205 [2024-07-12 10:57:42.994267] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:08.205 [2024-07-12 10:57:43.002257] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:08.205 [2024-07-12 10:57:43.002275] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:08.205 [2024-07-12 10:57:43.002287] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:08.205 [2024-07-12 10:57:43.010276] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:08.205 [2024-07-12 10:57:43.010293] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:08.205 [2024-07-12 10:57:43.010305] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:08.205 [2024-07-12 10:57:43.087687] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:08.205 [2024-07-12 10:57:43.087730] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:08.205 [2024-07-12 10:57:43.087751] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:08.205 [2024-07-12 10:57:43.089077] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:08.205 [2024-07-12 10:57:43.089148] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:08.205 [2024-07-12 10:57:43.089164] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:08.205 [2024-07-12 10:57:43.089209] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
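A minimal sketch, assuming the checkout paths used by this job, of the standalone hello_bdev invocation exercised here; the --json config and -b target are taken verbatim from the command line logged above:

  # Run the hello_bdev example against the crypto_ram vbdev defined in bdev.json.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev \
      --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
      -b crypto_ram
  # On success it opens the bdev, writes "Hello World!", reads it back, and stops
  # the app, matching the hello_bdev.c NOTICE lines above.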
00:31:08.205 00:31:08.205 [2024-07-12 10:57:43.089228] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:08.463 00:31:08.463 real 0m3.184s 00:31:08.463 user 0m2.764s 00:31:08.463 sys 0m0.385s 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:08.463 ************************************ 00:31:08.463 END TEST bdev_hello_world 00:31:08.463 ************************************ 00:31:08.463 10:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:08.463 10:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:31:08.463 10:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:08.463 10:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:08.463 10:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:08.463 ************************************ 00:31:08.463 START TEST bdev_bounds 00:31:08.463 ************************************ 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2195598 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2195598' 00:31:08.463 Process bdevio pid: 2195598 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2195598 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2195598 ']' 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:08.463 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:08.463 10:57:43 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:08.464 10:57:43 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:08.464 10:57:43 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:08.464 [2024-07-12 10:57:43.648743] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:31:08.464 [2024-07-12 10:57:43.648806] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2195598 ] 00:31:08.722 [2024-07-12 10:57:43.778330] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:08.722 [2024-07-12 10:57:43.885246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:08.722 [2024-07-12 10:57:43.885331] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:08.722 [2024-07-12 10:57:43.885336] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:08.722 [2024-07-12 10:57:43.906696] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:08.722 [2024-07-12 10:57:43.914717] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:08.980 [2024-07-12 10:57:43.922735] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:08.980 [2024-07-12 10:57:44.027959] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:11.509 [2024-07-12 10:57:46.255954] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:11.509 [2024-07-12 10:57:46.256038] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:11.509 [2024-07-12 10:57:46.256053] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:11.509 [2024-07-12 10:57:46.263968] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:11.509 [2024-07-12 10:57:46.263988] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:11.509 [2024-07-12 10:57:46.263999] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:11.509 [2024-07-12 10:57:46.271996] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:11.509 [2024-07-12 10:57:46.272017] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:11.509 [2024-07-12 10:57:46.272029] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:11.509 [2024-07-12 10:57:46.280017] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:11.509 [2024-07-12 10:57:46.280035] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:11.509 [2024-07-12 10:57:46.280047] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:11.509 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:11.509 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:31:11.509 10:57:46 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:11.509 I/O targets: 00:31:11.509 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:31:11.509 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:31:11.509 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:31:11.509 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:31:11.509 00:31:11.509 00:31:11.509 CUnit - A unit testing framework for C - Version 2.1-3 00:31:11.509 http://cunit.sourceforge.net/ 00:31:11.509 00:31:11.509 00:31:11.509 Suite: bdevio tests on: crypto_ram4 00:31:11.509 Test: blockdev write read block ...passed 00:31:11.509 Test: blockdev write zeroes read block ...passed 00:31:11.509 Test: blockdev write zeroes read no split ...passed 00:31:11.509 Test: blockdev write zeroes read split ...passed 00:31:11.509 Test: blockdev write zeroes read split partial ...passed 00:31:11.509 Test: blockdev reset ...passed 00:31:11.509 Test: blockdev write read 8 blocks ...passed 00:31:11.509 Test: blockdev write read size > 128k ...passed 00:31:11.509 Test: blockdev write read invalid size ...passed 00:31:11.509 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:11.509 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:11.509 Test: blockdev write read max offset ...passed 00:31:11.509 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:11.509 Test: blockdev writev readv 8 blocks ...passed 00:31:11.509 Test: blockdev writev readv 30 x 1block ...passed 00:31:11.510 Test: blockdev writev readv block ...passed 00:31:11.510 Test: blockdev writev readv size > 128k ...passed 00:31:11.510 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:11.510 Test: blockdev comparev and writev ...passed 00:31:11.510 Test: blockdev nvme passthru rw ...passed 00:31:11.510 Test: blockdev nvme passthru vendor specific ...passed 00:31:11.510 Test: blockdev nvme admin passthru ...passed 00:31:11.510 Test: blockdev copy ...passed 00:31:11.510 Suite: bdevio tests on: crypto_ram3 00:31:11.510 Test: blockdev write read block ...passed 00:31:11.510 Test: blockdev write zeroes read block ...passed 00:31:11.510 Test: blockdev write zeroes read no split ...passed 00:31:11.510 Test: blockdev write zeroes read split ...passed 00:31:11.510 Test: blockdev write zeroes read split partial ...passed 00:31:11.510 Test: blockdev reset ...passed 00:31:11.510 Test: blockdev write read 8 blocks ...passed 00:31:11.510 Test: blockdev write read size > 128k ...passed 00:31:11.510 Test: blockdev write read invalid size ...passed 00:31:11.510 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:11.510 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:11.510 Test: blockdev write read max offset ...passed 00:31:11.510 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:11.510 Test: blockdev writev readv 8 blocks ...passed 00:31:11.510 Test: blockdev writev readv 30 x 1block ...passed 00:31:11.510 Test: blockdev writev readv block ...passed 00:31:11.510 Test: blockdev writev readv size > 128k ...passed 00:31:11.510 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:11.510 Test: blockdev comparev and writev ...passed 00:31:11.510 Test: blockdev nvme passthru rw ...passed 00:31:11.510 Test: blockdev nvme passthru vendor specific ...passed 00:31:11.510 Test: blockdev nvme admin passthru ...passed 00:31:11.510 Test: blockdev copy ...passed 00:31:11.510 Suite: bdevio tests on: crypto_ram2 00:31:11.510 Test: blockdev write read block ...passed 00:31:11.510 Test: blockdev write zeroes read block ...passed 00:31:11.510 Test: blockdev write zeroes read no split ...passed 00:31:11.510 Test: blockdev write zeroes read split ...passed 00:31:11.510 Test: blockdev write zeroes read split partial ...passed 
00:31:11.510 Test: blockdev reset ...passed 00:31:11.510 Test: blockdev write read 8 blocks ...passed 00:31:11.510 Test: blockdev write read size > 128k ...passed 00:31:11.510 Test: blockdev write read invalid size ...passed 00:31:11.510 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:11.510 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:11.510 Test: blockdev write read max offset ...passed 00:31:11.510 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:11.510 Test: blockdev writev readv 8 blocks ...passed 00:31:11.510 Test: blockdev writev readv 30 x 1block ...passed 00:31:11.510 Test: blockdev writev readv block ...passed 00:31:11.510 Test: blockdev writev readv size > 128k ...passed 00:31:11.510 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:11.510 Test: blockdev comparev and writev ...passed 00:31:11.510 Test: blockdev nvme passthru rw ...passed 00:31:11.510 Test: blockdev nvme passthru vendor specific ...passed 00:31:11.510 Test: blockdev nvme admin passthru ...passed 00:31:11.510 Test: blockdev copy ...passed 00:31:11.510 Suite: bdevio tests on: crypto_ram 00:31:11.510 Test: blockdev write read block ...passed 00:31:11.510 Test: blockdev write zeroes read block ...passed 00:31:11.510 Test: blockdev write zeroes read no split ...passed 00:31:11.768 Test: blockdev write zeroes read split ...passed 00:31:11.768 Test: blockdev write zeroes read split partial ...passed 00:31:11.768 Test: blockdev reset ...passed 00:31:11.768 Test: blockdev write read 8 blocks ...passed 00:31:11.768 Test: blockdev write read size > 128k ...passed 00:31:11.768 Test: blockdev write read invalid size ...passed 00:31:11.768 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:11.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:11.768 Test: blockdev write read max offset ...passed 00:31:11.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:11.768 Test: blockdev writev readv 8 blocks ...passed 00:31:11.768 Test: blockdev writev readv 30 x 1block ...passed 00:31:11.768 Test: blockdev writev readv block ...passed 00:31:11.768 Test: blockdev writev readv size > 128k ...passed 00:31:11.768 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:11.768 Test: blockdev comparev and writev ...passed 00:31:11.768 Test: blockdev nvme passthru rw ...passed 00:31:11.768 Test: blockdev nvme passthru vendor specific ...passed 00:31:11.768 Test: blockdev nvme admin passthru ...passed 00:31:11.768 Test: blockdev copy ...passed 00:31:11.768 00:31:11.768 Run Summary: Type Total Ran Passed Failed Inactive 00:31:11.768 suites 4 4 n/a 0 0 00:31:11.768 tests 92 92 92 0 0 00:31:11.768 asserts 520 520 520 0 n/a 00:31:11.768 00:31:11.768 Elapsed time = 0.537 seconds 00:31:11.768 0 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2195598 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2195598 ']' 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2195598 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o 
comm= 2195598 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2195598' 00:31:11.768 killing process with pid 2195598 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2195598 00:31:11.768 10:57:46 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2195598 00:31:12.026 10:57:47 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:31:12.026 00:31:12.026 real 0m3.619s 00:31:12.026 user 0m10.020s 00:31:12.026 sys 0m0.587s 00:31:12.026 10:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:12.026 10:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:12.026 ************************************ 00:31:12.026 END TEST bdev_bounds 00:31:12.026 ************************************ 00:31:12.285 10:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:12.285 10:57:47 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:31:12.285 10:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:31:12.285 10:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:12.285 10:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:12.285 ************************************ 00:31:12.285 START TEST bdev_nbd 00:31:12.285 ************************************ 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 
-- # bdev_num=4 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2195995 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2195995 /var/tmp/spdk-nbd.sock 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2195995 ']' 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:31:12.285 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:12.285 10:57:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:12.285 [2024-07-12 10:57:47.357504] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:31:12.285 [2024-07-12 10:57:47.357568] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:12.543 [2024-07-12 10:57:47.488859] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:12.543 [2024-07-12 10:57:47.594909] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:12.543 [2024-07-12 10:57:47.616196] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:12.543 [2024-07-12 10:57:47.624216] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:12.543 [2024-07-12 10:57:47.632234] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:12.801 [2024-07-12 10:57:47.745148] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:15.332 [2024-07-12 10:57:49.964499] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:15.332 [2024-07-12 10:57:49.964558] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:15.332 [2024-07-12 10:57:49.964573] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:15.332 [2024-07-12 10:57:49.972521] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:15.332 [2024-07-12 10:57:49.972540] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:15.332 [2024-07-12 10:57:49.972551] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:15.332 [2024-07-12 10:57:49.980537] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:15.332 [2024-07-12 10:57:49.980555] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:15.332 [2024-07-12 10:57:49.980567] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:15.332 [2024-07-12 10:57:49.988558] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:15.332 [2024-07-12 10:57:49.988574] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:15.332 [2024-07-12 10:57:49.988586] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:15.332 1+0 records in 00:31:15.332 1+0 records out 00:31:15.332 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000206161 s, 19.9 MB/s 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:15.332 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:31:15.590 
10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:15.590 1+0 records in 00:31:15.590 1+0 records out 00:31:15.590 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000354844 s, 11.5 MB/s 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:31:15.590 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:15.849 1+0 records in 00:31:15.849 1+0 records out 00:31:15.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311682 s, 13.1 MB/s 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:15.849 10:57:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:16.107 1+0 records in 00:31:16.107 1+0 records out 00:31:16.107 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000355284 s, 11.5 MB/s 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:16.107 10:57:51 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:16.107 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd0", 00:31:16.365 "bdev_name": "crypto_ram" 00:31:16.365 }, 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd1", 00:31:16.365 "bdev_name": "crypto_ram2" 00:31:16.365 }, 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd2", 00:31:16.365 "bdev_name": "crypto_ram3" 00:31:16.365 }, 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd3", 00:31:16.365 "bdev_name": "crypto_ram4" 00:31:16.365 } 00:31:16.365 ]' 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd0", 00:31:16.365 "bdev_name": "crypto_ram" 00:31:16.365 }, 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd1", 00:31:16.365 "bdev_name": "crypto_ram2" 00:31:16.365 }, 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd2", 00:31:16.365 "bdev_name": "crypto_ram3" 00:31:16.365 }, 00:31:16.365 { 00:31:16.365 "nbd_device": "/dev/nbd3", 00:31:16.365 "bdev_name": "crypto_ram4" 00:31:16.365 } 00:31:16.365 ]' 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:16.365 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:16.624 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:16.882 10:57:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:17.141 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
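A minimal sketch, not part of the harness output, of the stop-and-verify pattern the NBD test just executed: each exported device is detached through the bdev_svc RPC socket, then nbd_get_disks is queried to confirm nothing remains exported. The commands and socket path are the ones logged above:

  # Detach all four NBD devices, then confirm none remain exported.
  for dev in /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3; do
      ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$dev"
  done
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks | jq -r '.[] | .nbd_device'
  # An empty result means zero devices are still exported, which is the count=0
  # condition the nbd_get_count check above asserts before teardown.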
00:31:17.398 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:17.656 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:31:17.914 /dev/nbd0 00:31:17.914 10:57:52 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:17.914 10:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:17.914 10:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:17.914 10:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:17.914 10:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:17.914 10:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:17.914 10:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:17.914 1+0 records in 00:31:17.914 1+0 records out 00:31:17.914 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225825 s, 18.1 MB/s 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:17.914 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:31:18.172 /dev/nbd1 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:18.172 1+0 records in 00:31:18.172 1+0 records out 00:31:18.172 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290121 s, 14.1 MB/s 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:18.172 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:31:18.432 /dev/nbd10 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:18.432 1+0 records in 00:31:18.432 1+0 records out 00:31:18.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359268 s, 11.4 MB/s 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:31:18.432 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:31:18.692 /dev/nbd11 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:18.692 1+0 records in 00:31:18.692 1+0 records out 00:31:18.692 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000384352 s, 10.7 MB/s 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:18.692 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:18.950 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd0", 00:31:18.950 "bdev_name": "crypto_ram" 00:31:18.950 }, 00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd1", 00:31:18.950 "bdev_name": "crypto_ram2" 00:31:18.950 }, 00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd10", 00:31:18.950 "bdev_name": "crypto_ram3" 00:31:18.950 }, 00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd11", 00:31:18.950 "bdev_name": "crypto_ram4" 00:31:18.950 } 00:31:18.950 ]' 00:31:18.950 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd0", 00:31:18.950 "bdev_name": "crypto_ram" 00:31:18.950 }, 00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd1", 00:31:18.950 "bdev_name": "crypto_ram2" 00:31:18.950 }, 00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd10", 00:31:18.950 "bdev_name": "crypto_ram3" 00:31:18.950 }, 00:31:18.950 { 00:31:18.950 "nbd_device": "/dev/nbd11", 00:31:18.950 "bdev_name": "crypto_ram4" 00:31:18.950 } 00:31:18.950 ]' 00:31:18.950 10:57:53 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:18.950 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:31:18.951 /dev/nbd1 00:31:18.951 /dev/nbd10 00:31:18.951 /dev/nbd11' 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:31:18.951 /dev/nbd1 00:31:18.951 /dev/nbd10 00:31:18.951 /dev/nbd11' 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:31:18.951 256+0 records in 00:31:18.951 256+0 records out 00:31:18.951 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0101593 s, 103 MB/s 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:31:18.951 256+0 records in 00:31:18.951 256+0 records out 00:31:18.951 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0619271 s, 16.9 MB/s 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:18.951 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:31:19.209 256+0 records in 00:31:19.209 256+0 records out 00:31:19.209 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0650315 s, 16.1 MB/s 00:31:19.209 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:19.209 10:57:54 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:31:19.209 256+0 records in 00:31:19.209 256+0 records out 00:31:19.209 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0608693 s, 17.2 MB/s 00:31:19.209 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:19.209 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:31:19.209 256+0 records in 00:31:19.209 256+0 records out 00:31:19.209 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0578554 s, 18.1 MB/s 00:31:19.209 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:31:19.209 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:19.209 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:19.210 
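
That completes the write-and-verify pass of nbd_dd_data_verify: a single 1 MiB random pattern is generated once, written to every exported NBD device with direct I/O, and each device is then compared byte-for-byte against the same pattern file, so a mismatch on any crypto bdev fails the test. Stripped of the per-device loop and the workspace paths, the pattern used in the trace is essentially:

    # 1 MiB of random reference data, generated once
    dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
    # write it through the NBD device, bypassing the page cache
    dd if=nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
    # read it back through the crypto bdev and compare the first 1 MiB
    cmp -b -n 1M nbdrandtest /dev/nbd0
    rm nbdrandtest
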
10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:19.210 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:19.468 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:19.726 10:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:31:19.984 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:31:19.984 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:19.985 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:20.243 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:20.501 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:20.501 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:20.501 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:31:20.760 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:31:21.018 malloc_lvol_verify 00:31:21.018 10:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:31:21.018 c66b01ae-1944-4499-8053-f83309a57c51 00:31:21.276 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:31:21.276 75ab554c-c8f8-4105-ac0c-7f90c09f0da4 00:31:21.277 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:31:21.535 /dev/nbd0 00:31:21.535 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:31:21.535 mke2fs 1.46.5 (30-Dec-2021) 00:31:21.535 Discarding device blocks: 0/4096 done 00:31:21.535 Creating filesystem with 4096 1k blocks and 1024 inodes 00:31:21.535 00:31:21.535 Allocating group tables: 0/1 done 00:31:21.535 Writing inode tables: 0/1 done 00:31:21.535 Creating journal (1024 blocks): done 00:31:21.535 Writing superblocks and filesystem accounting information: 0/1 done 00:31:21.535 00:31:21.535 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:31:21.535 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:31:21.535 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:21.535 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:21.535 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:21.535 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2195995 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2195995 ']' 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2195995 00:31:21.793 10:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:31:22.052 10:57:56 blockdev_crypto_aesni.bdev_nbd -- 
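
nbd_with_lvol_verify, traced above, layers a logical volume on top of a fresh malloc bdev and formats it over NBD with mkfs.ext4, which exercises metadata-heavy filesystem I/O through the same spdk-nbd path. Reconstructed from the RPCs in the trace (bdev, lvstore and lvol names are taken verbatim from it), the flow is approximately:

    rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512
    rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs
    rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0
    mkfs.ext4 /dev/nbd0
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
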
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:22.052 10:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2195995 00:31:22.052 10:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:22.052 10:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:22.052 10:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2195995' 00:31:22.052 killing process with pid 2195995 00:31:22.052 10:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2195995 00:31:22.052 10:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2195995 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:31:22.310 00:31:22.310 real 0m10.133s 00:31:22.310 user 0m13.094s 00:31:22.310 sys 0m4.088s 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:22.310 ************************************ 00:31:22.310 END TEST bdev_nbd 00:31:22.310 ************************************ 00:31:22.310 10:57:57 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:22.310 10:57:57 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:31:22.310 10:57:57 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:31:22.310 10:57:57 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:31:22.310 10:57:57 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:31:22.310 10:57:57 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:22.310 10:57:57 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:22.310 10:57:57 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:22.310 ************************************ 00:31:22.310 START TEST bdev_fio 00:31:22.310 ************************************ 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:22.310 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:31:22.310 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio 
-- common/autotest_common.sh@1281 -- # local workload=verify 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:22.572 ************************************ 00:31:22.572 START TEST bdev_fio_rw_verify 00:31:22.572 ************************************ 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:22.572 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:22.573 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:22.573 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:22.573 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:22.573 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:22.573 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:22.573 10:57:57 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.879 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.879 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.879 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.879 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.879 fio-3.35 00:31:22.879 Starting 4 threads 00:31:37.752 00:31:37.752 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2198029: Fri Jul 12 10:58:10 2024 00:31:37.752 read: IOPS=20.3k, BW=79.3MiB/s (83.1MB/s)(793MiB/10001msec) 00:31:37.752 slat (usec): min=17, max=1404, avg=64.90, stdev=49.55 00:31:37.752 clat (usec): min=10, max=2643, avg=350.28, stdev=296.83 00:31:37.752 lat (usec): min=29, max=2876, avg=415.18, stdev=334.37 00:31:37.752 clat percentiles (usec): 00:31:37.752 | 50.000th=[ 260], 99.000th=[ 1516], 99.900th=[ 1762], 99.990th=[ 1844], 00:31:37.752 | 99.999th=[ 1876] 00:31:37.752 write: IOPS=22.3k, BW=87.1MiB/s (91.3MB/s)(848MiB/9736msec); 0 zone resets 00:31:37.752 slat (usec): min=22, max=288, avg=80.03, stdev=49.49 00:31:37.752 clat (usec): min=42, max=2361, avg=434.62, stdev=348.18 00:31:37.752 lat (usec): min=91, max=2585, avg=514.65, stdev=385.30 00:31:37.752 clat percentiles (usec): 00:31:37.752 | 50.000th=[ 347], 99.000th=[ 1663], 99.900th=[ 2212], 99.990th=[ 2278], 00:31:37.752 | 99.999th=[ 2311] 00:31:37.752 bw ( KiB/s): min=68936, max=120104, per=98.06%, avg=87451.42, stdev=3582.75, samples=76 00:31:37.752 iops : min=17234, max=30026, avg=21862.58, stdev=895.73, samples=76 00:31:37.752 lat (usec) : 20=0.01%, 50=0.01%, 100=7.14%, 250=32.44%, 500=38.62% 00:31:37.752 lat (usec) : 750=9.67%, 1000=5.36% 00:31:37.752 lat (msec) : 2=6.44%, 4=0.33% 00:31:37.752 cpu : usr=99.60%, sys=0.00%, ctx=49, majf=0, minf=302 00:31:37.752 IO depths : 1=10.6%, 2=25.4%, 4=50.9%, 8=13.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:37.752 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:37.752 complete : 0=0.0%, 4=88.8%, 8=11.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:37.752 issued rwts: 
total=202925,217057,0,0 short=0,0,0,0 dropped=0,0,0,0 00:31:37.752 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:37.752 00:31:37.752 Run status group 0 (all jobs): 00:31:37.752 READ: bw=79.3MiB/s (83.1MB/s), 79.3MiB/s-79.3MiB/s (83.1MB/s-83.1MB/s), io=793MiB (831MB), run=10001-10001msec 00:31:37.752 WRITE: bw=87.1MiB/s (91.3MB/s), 87.1MiB/s-87.1MiB/s (91.3MB/s-91.3MB/s), io=848MiB (889MB), run=9736-9736msec 00:31:37.752 00:31:37.752 real 0m13.455s 00:31:37.752 user 0m46.000s 00:31:37.752 sys 0m0.456s 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:37.752 ************************************ 00:31:37.752 END TEST bdev_fio_rw_verify 00:31:37.752 ************************************ 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:31:37.752 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9be9f94c-ece4-5bed-bbd4-8f24e4fbb91c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9be9f94c-ece4-5bed-bbd4-8f24e4fbb91c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": 
{' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d33ea4c5-a3bc-524b-9a7f-5da4206c0870"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d33ea4c5-a3bc-524b-9a7f-5da4206c0870",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ee70e39d-9ac7-5843-99b7-b4d98ef14387"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ee70e39d-9ac7-5843-99b7-b4d98ef14387",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "7a1b0f70-0eea-52de-a31c-c4ae6d196919"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' 
"num_blocks": 8192,' ' "uuid": "7a1b0f70-0eea-52de-a31c-c4ae6d196919",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:37.753 crypto_ram2 00:31:37.753 crypto_ram3 00:31:37.753 crypto_ram4 ]] 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "9be9f94c-ece4-5bed-bbd4-8f24e4fbb91c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9be9f94c-ece4-5bed-bbd4-8f24e4fbb91c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d33ea4c5-a3bc-524b-9a7f-5da4206c0870"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d33ea4c5-a3bc-524b-9a7f-5da4206c0870",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ee70e39d-9ac7-5843-99b7-b4d98ef14387"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ee70e39d-9ac7-5843-99b7-b4d98ef14387",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "7a1b0f70-0eea-52de-a31c-c4ae6d196919"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7a1b0f70-0eea-52de-a31c-c4ae6d196919",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:37.753 10:58:11 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:37.753 ************************************ 00:31:37.753 START TEST bdev_fio_trim 00:31:37.753 ************************************ 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:37.753 10:58:11 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:37.753 10:58:11 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:37.753 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.753 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.753 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.753 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:37.753 fio-3.35 00:31:37.753 Starting 4 threads 00:31:49.953 00:31:49.953 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2199883: Fri Jul 12 10:58:24 2024 00:31:49.953 write: IOPS=34.9k, BW=137MiB/s (143MB/s)(1365MiB/10001msec); 0 zone resets 00:31:49.953 slat (usec): min=16, max=532, avg=65.04, stdev=36.76 00:31:49.953 clat (usec): min=53, max=2259, avg=290.15, stdev=185.14 
00:31:49.953 lat (usec): min=70, max=2630, avg=355.19, stdev=208.35 00:31:49.953 clat percentiles (usec): 00:31:49.953 | 50.000th=[ 243], 99.000th=[ 938], 99.900th=[ 1172], 99.990th=[ 1336], 00:31:49.953 | 99.999th=[ 2024] 00:31:49.953 bw ( KiB/s): min=126752, max=172760, per=100.00%, avg=140358.32, stdev=4257.08, samples=76 00:31:49.953 iops : min=31688, max=43190, avg=35089.58, stdev=1064.27, samples=76 00:31:49.953 trim: IOPS=34.9k, BW=137MiB/s (143MB/s)(1365MiB/10001msec); 0 zone resets 00:31:49.953 slat (usec): min=6, max=1341, avg=18.42, stdev= 7.80 00:31:49.953 clat (usec): min=10, max=1735, avg=273.55, stdev=123.70 00:31:49.953 lat (usec): min=33, max=1761, avg=291.98, stdev=126.29 00:31:49.953 clat percentiles (usec): 00:31:49.953 | 50.000th=[ 253], 99.000th=[ 652], 99.900th=[ 791], 99.990th=[ 906], 00:31:49.953 | 99.999th=[ 1516] 00:31:49.953 bw ( KiB/s): min=126760, max=172784, per=100.00%, avg=140359.58, stdev=4257.52, samples=76 00:31:49.953 iops : min=31690, max=43196, avg=35089.89, stdev=1064.38, samples=76 00:31:49.953 lat (usec) : 20=0.01%, 100=5.45%, 250=45.21%, 500=40.35%, 750=7.17% 00:31:49.953 lat (usec) : 1000=1.54% 00:31:49.953 lat (msec) : 2=0.28%, 4=0.01% 00:31:49.953 cpu : usr=99.58%, sys=0.00%, ctx=81, majf=0, minf=106 00:31:49.953 IO depths : 1=7.9%, 2=26.3%, 4=52.7%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:49.953 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:49.953 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:49.953 issued rwts: total=0,349510,349512,0 short=0,0,0,0 dropped=0,0,0,0 00:31:49.953 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:49.953 00:31:49.953 Run status group 0 (all jobs): 00:31:49.953 WRITE: bw=137MiB/s (143MB/s), 137MiB/s-137MiB/s (143MB/s-143MB/s), io=1365MiB (1432MB), run=10001-10001msec 00:31:49.953 TRIM: bw=137MiB/s (143MB/s), 137MiB/s-137MiB/s (143MB/s-143MB/s), io=1365MiB (1432MB), run=10001-10001msec 00:31:49.953 00:31:49.953 real 0m13.542s 00:31:49.953 user 0m45.945s 00:31:49.953 sys 0m0.509s 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:31:49.953 ************************************ 00:31:49.953 END TEST bdev_fio_trim 00:31:49.953 ************************************ 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:31:49.953 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:31:49.953 00:31:49.953 real 0m27.333s 00:31:49.953 user 1m32.120s 00:31:49.953 sys 0m1.147s 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:49.953 ************************************ 00:31:49.953 END TEST bdev_fio 00:31:49.953 ************************************ 00:31:49.953 10:58:24 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # 
return 0 00:31:49.953 10:58:24 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:49.953 10:58:24 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:49.953 10:58:24 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:31:49.953 10:58:24 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:49.953 10:58:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:49.953 ************************************ 00:31:49.953 START TEST bdev_verify 00:31:49.953 ************************************ 00:31:49.953 10:58:24 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:49.953 [2024-07-12 10:58:24.973335] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:31:49.953 [2024-07-12 10:58:24.973399] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2201307 ] 00:31:49.953 [2024-07-12 10:58:25.102839] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:50.212 [2024-07-12 10:58:25.201399] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:50.212 [2024-07-12 10:58:25.201405] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:50.212 [2024-07-12 10:58:25.222771] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:50.212 [2024-07-12 10:58:25.230800] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:50.212 [2024-07-12 10:58:25.238833] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:50.212 [2024-07-12 10:58:25.338143] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:52.743 [2024-07-12 10:58:27.576532] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:52.743 [2024-07-12 10:58:27.576609] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:52.743 [2024-07-12 10:58:27.576624] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.743 [2024-07-12 10:58:27.584543] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:52.743 [2024-07-12 10:58:27.584562] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:52.743 [2024-07-12 10:58:27.584573] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.743 [2024-07-12 10:58:27.592564] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:52.743 [2024-07-12 10:58:27.592581] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:52.743 [2024-07-12 10:58:27.592593] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.743 [2024-07-12 10:58:27.600593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:52.743 [2024-07-12 10:58:27.600611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:52.743 [2024-07-12 10:58:27.600623] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.743 Running I/O for 5 seconds... 00:31:58.002 00:31:58.002 Latency(us) 00:31:58.002 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:58.002 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x0 length 0x1000 00:31:58.002 crypto_ram : 5.08 504.40 1.97 0.00 0.00 253097.46 5157.40 177802.02 00:31:58.002 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x1000 length 0x1000 00:31:58.002 crypto_ram : 5.08 504.36 1.97 0.00 0.00 253094.66 5898.24 176890.21 00:31:58.002 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x0 length 0x1000 00:31:58.002 crypto_ram2 : 5.08 504.12 1.97 0.00 0.00 252366.70 5242.88 168683.97 00:31:58.002 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x1000 length 0x1000 00:31:58.002 crypto_ram2 : 5.08 503.97 1.97 0.00 0.00 252373.72 6012.22 167772.16 00:31:58.002 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x0 length 0x1000 00:31:58.002 crypto_ram3 : 5.06 3894.37 15.21 0.00 0.00 32554.96 5271.37 28151.99 00:31:58.002 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x1000 length 0x1000 00:31:58.002 crypto_ram3 : 5.06 3916.21 15.30 0.00 0.00 32376.68 4502.04 28038.01 00:31:58.002 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x0 length 0x1000 00:31:58.002 crypto_ram4 : 5.06 3894.87 15.21 0.00 0.00 32457.48 5470.83 25530.55 00:31:58.002 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:58.002 Verification LBA range: start 0x1000 length 0x1000 00:31:58.002 crypto_ram4 : 5.07 3916.77 15.30 0.00 0.00 32276.74 4673.00 25302.59 00:31:58.002 =================================================================================================================== 00:31:58.002 Total : 17639.07 68.90 0.00 0.00 57671.35 4502.04 177802.02 00:31:58.259 00:31:58.259 real 0m8.312s 00:31:58.259 user 0m15.734s 00:31:58.259 sys 0m0.400s 00:31:58.259 10:58:33 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:58.259 10:58:33 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:31:58.259 ************************************ 00:31:58.259 END TEST bdev_verify 00:31:58.259 ************************************ 00:31:58.259 10:58:33 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:58.259 10:58:33 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 
00:31:58.259 10:58:33 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:31:58.259 10:58:33 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:58.259 10:58:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:58.259 ************************************ 00:31:58.259 START TEST bdev_verify_big_io 00:31:58.259 ************************************ 00:31:58.259 10:58:33 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:58.259 [2024-07-12 10:58:33.368828] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:31:58.259 [2024-07-12 10:58:33.368888] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2202363 ] 00:31:58.517 [2024-07-12 10:58:33.496703] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:58.517 [2024-07-12 10:58:33.598508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:58.517 [2024-07-12 10:58:33.598514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:58.517 [2024-07-12 10:58:33.619862] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:58.517 [2024-07-12 10:58:33.627891] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:58.517 [2024-07-12 10:58:33.635921] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:58.774 [2024-07-12 10:58:33.743987] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:01.374 [2024-07-12 10:58:35.970869] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:01.374 [2024-07-12 10:58:35.970945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:01.374 [2024-07-12 10:58:35.970959] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:01.374 [2024-07-12 10:58:35.978886] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:01.374 [2024-07-12 10:58:35.978905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:01.374 [2024-07-12 10:58:35.978917] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:01.374 [2024-07-12 10:58:35.986908] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:01.374 [2024-07-12 10:58:35.986925] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:01.374 [2024-07-12 10:58:35.986936] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:01.374 [2024-07-12 10:58:35.994928] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:01.374 [2024-07-12 10:58:35.994945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:01.374 [2024-07-12 10:58:35.994956] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:01.374 Running I/O for 5 seconds... 00:32:01.945 [2024-07-12 10:58:36.878358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:01.945 [2024-07-12 10:58:36.878813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:01.945 [2024-07-12 10:58:36.879002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:01.945 [2024-07-12 10:58:36.879069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:01.945 [2024-07-12 10:58:36.879118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:01.945 [2024-07-12 10:58:36.879452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.880566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.880620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.880676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.880718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.881280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.881338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.881407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.881448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.881815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.882928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.882999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.883054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.883096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.883550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.883600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.883642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.945 [2024-07-12 10:58:36.883688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! -- this message recurs continuously in the bdevperf log from 10:58:36.884 through 10:58:37.047 during the 5-second big-IO verify run; the intervening duplicate error lines are omitted here]
00:32:01.949 [2024-07-12 10:58:37.049576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.051213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.052713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.055609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.057198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.058650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.059043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.060403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.061691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.063002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.064312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.066729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.068050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.069150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.069547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.071660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.073269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.074747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.076096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.078753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.080190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.080590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.080978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.082589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:01.949 [2024-07-12 10:58:37.083897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.085204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.086538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.088991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.090211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.090611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.091002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.092748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.094054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.095351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.096173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.098597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.099182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.099574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.100445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.102152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.103562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.105114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.106313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.108912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.109306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.109701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.949 [2024-07-12 10:58:37.111215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.112918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:01.950 [2024-07-12 10:58:37.114216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.115125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.116608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.118618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.119014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.119716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.120999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.122933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.124621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.125715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.126983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.128494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.128890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.130296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.131566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.133267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.134263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:01.950 [2024-07-12 10:58:37.135822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.137446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.139012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.139658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.140929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.142282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.144119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.213 [2024-07-12 10:58:37.145079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.146358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.147633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.149189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.150493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.151768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.153073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.154591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.156259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.157768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.159138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.160983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.162290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.163736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.165313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.166493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.167774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.169077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.170377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.172770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.174036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.175322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.176615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.178453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.213 [2024-07-12 10:58:37.179842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.181140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.182450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.185374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.186950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.188588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.190085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.191750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.193055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.194416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.195920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.198419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.199729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.201042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.202172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.203824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.205137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.206444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.207188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.210145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.211753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.213226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.214118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.215832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.213 [2024-07-12 10:58:37.217177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.218640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.219031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.221546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.222858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.224084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.225600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.227333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.228652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.229506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.229891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.232415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.233729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.234565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.235837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.213 [2024-07-12 10:58:37.237819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.239427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.239837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.240224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.242667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.244084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.244733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.246027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.247964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.214 [2024-07-12 10:58:37.249643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.250039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.250423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.252834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.254276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.255336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.256619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.258261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.259662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.260055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.260440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.263275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.263696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.264081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.264469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.265290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.265693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.266081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.266473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.268477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.268878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.269270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.269659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.270454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.214 [2024-07-12 10:58:37.270865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.271252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.271647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.273554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.273951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.274341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.274385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.274776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.275215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.275712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.276109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.276504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.276890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.278333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.278385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.278430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.278473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.278873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.279369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.279425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.279467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.279525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.280845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.280897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.214 [2024-07-12 10:58:37.280940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.280994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.281363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.281526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.281587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.281629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.281683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.283399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.283452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.283511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.283557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.283908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.284062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.284112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.284177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.284220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.285493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.285557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.285600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.285642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.286005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.286157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.286208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.286250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.214 [2024-07-12 10:58:37.286306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.288811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.290976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.214 [2024-07-12 10:58:37.291017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.292462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.292529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.292572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.292612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.293067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.215 [2024-07-12 10:58:37.293208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.293265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.293306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.293347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.294738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.294798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.294841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.294887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.295296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.295442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.295495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.295537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.295581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.297838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.299251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.299303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.215 [2024-07-12 10:58:37.299353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.299397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.299836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.299987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.300033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.300089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.300130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.301739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.301793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.301834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.301875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.302150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.302301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.302372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.302425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.302468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.303858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.303909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.303950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.303995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.304364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.304522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.304568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.304609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.215 [2024-07-12 10:58:37.304664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.306237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.306315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.306357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.306423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.306783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.306932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.307007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.307049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.307104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.308495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.308552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.308619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.308663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.309056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.309206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.309253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.309303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.309344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.310872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.310923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.310965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.311006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.215 [2024-07-12 10:58:37.311374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.215 [2024-07-12 10:58:37.311533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! record repeats continuously between 10:58:37.311533 and 10:58:37.675641 ...]
00:32:02.745 [2024-07-12 10:58:37.675641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:02.745 [2024-07-12 10:58:37.675687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.675728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.675769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.677974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.679307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.679375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.679428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.679494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.679855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.680005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.680056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.680098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.680142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.681933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.681984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.682049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.745 [2024-07-12 10:58:37.682091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.682526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.682675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.682721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.682762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.682803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.684968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.685008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.686422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.686478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.686526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.686568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.686943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.687087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.687133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.687174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.687216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.745 [2024-07-12 10:58:37.688390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.688442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.688490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.688532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.688917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.689063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.689112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.689153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.689194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.690789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.690840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.745 [2024-07-12 10:58:37.690882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.690922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.691196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.691342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.691386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.691427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.691467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.692714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.692768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.692810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.692866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.693325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.693471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.746 [2024-07-12 10:58:37.693527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.693568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.693609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.694668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.694718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.694765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.694807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.695209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.695357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.695415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.695458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.695506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.696816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.696866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.696909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.696959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.697223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.697371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.697421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.697462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.697509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.698699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.698749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.698789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.746 [2024-07-12 10:58:37.698830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.699137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.699287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.699333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.699375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.699420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.700748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.700798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.700839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.700879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.701211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.701362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.701407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.701456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.701507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.702671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.702721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.702761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.702803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.703115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.703261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.703306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.703347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.703398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.746 [2024-07-12 10:58:37.704687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.704737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.704779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.704820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.705084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.705228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.705276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.705326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.705367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.706578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.706628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.706672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.706715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.707024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.707171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.707216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.707257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.707298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.708496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.708546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.708588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.708628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.708973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.709118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.746 [2024-07-12 10:58:37.709163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.709205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.709248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.710563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.710613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.710653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.710694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.711009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.711157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.711202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.711250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.711292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.712441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.712498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.746 [2024-07-12 10:58:37.712540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.712582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.712888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.713035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.713088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.713129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.713172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.714490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.714541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.714581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.747 [2024-07-12 10:58:37.714623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.714887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.715035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.715081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.715130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.715172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.716382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.716432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.716472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.716518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.716855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.717006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.717052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.717093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.717134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.718999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.719042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.747 [2024-07-12 10:58:37.720333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.720385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.720425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.720465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.720781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.720928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.720977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.721025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.721067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.722934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.724321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.724370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.724411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.724452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.724724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.724869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.747 [2024-07-12 10:58:37.724914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.724954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.725003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.726277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.726328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.726369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.727296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.727606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.727754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.727800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.727840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.727881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.729077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.729478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.729893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.731317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.731591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.731738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.733378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.734960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.735977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.738659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.739058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.739449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.747 [2024-07-12 10:58:37.740845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.741152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.742561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.743869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.744792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.746277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.748225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.748629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.749339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.750617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.750887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.752512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.754133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.755302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.756515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.758059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.758455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.747 [2024-07-12 10:58:37.759782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.761052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.761356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.762757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.763803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.765407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.766960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.748 [2024-07-12 10:58:37.768603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.769154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.770438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.771883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.772153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.773754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.774715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.775992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.777287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.778848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.780197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.781465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.782764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.783090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.784208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.785765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.787385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.788876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.790837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.792124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.793428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.794824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.795091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.796285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.748 [2024-07-12 10:58:37.797574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.798873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.800173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.803007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.804344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.805623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.806922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.807226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.808812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.810414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.812002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.813453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.815927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.817235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.818652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.820187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.820458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.821837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.823153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.824457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.825511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.828502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.829808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:02.748 [2024-07-12 10:58:37.831101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:02.748 [2024-07-12 10:58:37.831963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:02.748 (identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated for every allocation attempt between 10:58:37.832 and 10:58:38.100)
00:32:03.016 [2024-07-12 10:58:38.099873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:03.016 [2024-07-12 10:58:38.101396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.101744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.102230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.102634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.104238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.105805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.108460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.109947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.111545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.113048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.113428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.113919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.114943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.116219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.117516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.119895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.121208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.122723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.123490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.123906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.124394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.125957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.127598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.129125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.016 [2024-07-12 10:58:38.131756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.133291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.134675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.016 [2024-07-12 10:58:38.135065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.135435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.136647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.137910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.139189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.140705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.143236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.144760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.145382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.145779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.146184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.147742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.149345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.150957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.152608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.155320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.156658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.157048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.157439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.157715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.159085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.017 [2024-07-12 10:58:38.160376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.161906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.162788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.165553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.166143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.166539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.167038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.167308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.168994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.170630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.172242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.173180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.175701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.176098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.176495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.177780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.178139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.179543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.181071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.181850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.183451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.185220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.185626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.186170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.017 [2024-07-12 10:58:38.187455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.187731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.189466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.191040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.192038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.193298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.194881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.195287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.196628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.197950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.198259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.199948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.200609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.017 [2024-07-12 10:58:38.201937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.203253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.204943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.206340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.207605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.208903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.209173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.210030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.211552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.213095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.214682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.279 [2024-07-12 10:58:38.216789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.218061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.219346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.220864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.221133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.222583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.223847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.225131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.226653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.230316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.231881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.233315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.234842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.235196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.236580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.237956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.239489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.240921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.243392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.244696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.246202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.247299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.247572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.248946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.279 [2024-07-12 10:58:38.250247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.251775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.252448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.255552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.256873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.258395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.259031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.259299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.260884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.262470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.263969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.264357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.266874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.268410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.269579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.270919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.271231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.272622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.274153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.274939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.275329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.277937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.279462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.280121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.279 [2024-07-12 10:58:38.281389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.281666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.283348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.284811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.285202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.285599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.288293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.289466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.290804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.292078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.292385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.294018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.294776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.295167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.295560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.297312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.298157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.299442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.300574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.300882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.302264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.279 [2024-07-12 10:58:38.303337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.303735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.304122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.280 [2024-07-12 10:58:38.306417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.307602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.308870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.310149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.310453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.310952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.311349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.311749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.312139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.313767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.314168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.314571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.314963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.315409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.315907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.316312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.316712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.317099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.318794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.319198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.319594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.319986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.320340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.320835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.280 [2024-07-12 10:58:38.321234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.321648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.322046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.323843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.324245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.324647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.325041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.325395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.325891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.326299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.326706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.327101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.328969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.329369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.329772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.330164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.330565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.331053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.331448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.331846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.332240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.334101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.334511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.334898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.280 [2024-07-12 10:58:38.335288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.335638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.336122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.336528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.336919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.337325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.340088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.340496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.340889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.341281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.341557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.342043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.343500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.343884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.344276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.347203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.347635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.348025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.348419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.348699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.349328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.350841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.351236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.351626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.280 [2024-07-12 10:58:38.354087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.354754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.355149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.355546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.355900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.357021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.358077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.358870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.359260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.361335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.362453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.362853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.363247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.363619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.365207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.365805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.367057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.367446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.369194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.370770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.371155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.371552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.371942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.280 [2024-07-12 10:58:38.373508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.280 [2024-07-12 10:58:38.373907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.375566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.375953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.377807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.379325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.379740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.379787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.380149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.380644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.382060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.382492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.384071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.386548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.386606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.386648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.386689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.387118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.388852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.388908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.388950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.388991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.390437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.390502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.390544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.281 [2024-07-12 10:58:38.390585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.390874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.391023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.391071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.391113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.391160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.392632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.392682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.392724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.392764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.393029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.393183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.393230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.393272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.393313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.394590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.394645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.394685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.394726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.394995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.395145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.395190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.395234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.281 [2024-07-12 10:58:38.395278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.281 [2024-07-12 10:58:38.396556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:03.543 [2024-07-12 10:58:38.669745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" records (accel_dpdk_cryptodev.c: 468) repeat continuously between the two timestamps above ...]
00:32:03.543 [2024-07-12 10:58:38.670125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:03.807 [2024-07-12 10:58:38.797083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... identical "Failed to get dst_mbufs!" records (accel_dpdk_cryptodev.c: 476) repeat continuously between the two timestamps above ...]
00:32:03.807 [2024-07-12 10:58:38.797555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:03.808 [2024-07-12 10:58:38.875263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" records (accel_dpdk_cryptodev.c: 468) repeat continuously between the two timestamps above ...]
00:32:03.808 [2024-07-12 10:58:38.875318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.876839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.876885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.877280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.877720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.877769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.879048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.880546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.880601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.880985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.881028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.881424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.882728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.808 [2024-07-12 10:58:38.882775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.884331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.887121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.887185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.888709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.888756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.889154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.889552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.889597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.889994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.892665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.809 [2024-07-12 10:58:38.892720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.893126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.893171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.893579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.895110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.895157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.896671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.898991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.899047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.900319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.900365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.900764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.902310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.902367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.903043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.905747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.905803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.906187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.906231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.906754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.908172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.908220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.909642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.912417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.809 [2024-07-12 10:58:38.912479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.914114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.914161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.914567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.915812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.915859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.916242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.919067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.919124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.920636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.920687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.921194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.922567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.922616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.924130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.925735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.925790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.927003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.927049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.927492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.929011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.929058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.930637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.933495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.809 [2024-07-12 10:58:38.933568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.934982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.935027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.935540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.935934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.935980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.937499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.939424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.939487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.941050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.941094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.941511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.943034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.943082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.944251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.946725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.946781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.948295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.949809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.950306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.951905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.951952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.953503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.954783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.809 [2024-07-12 10:58:38.955180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.955223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.956281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.956758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.957665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.957715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.959289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.962109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.962337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.963877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.963928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.964404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.965930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.967539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.969059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.970456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.809 [2024-07-12 10:58:38.970902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.972182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.973525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.973922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.974811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.976097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.976512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.978042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:03.810 [2024-07-12 10:58:38.978840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.980310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.982248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.982650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.983037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.984594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.986502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.988196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.988788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.990064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.991528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.991923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.992894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.994186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.995812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.997126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:03.810 [2024-07-12 10:58:38.997615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:38.998001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.000408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.001146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.001538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.001923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.003596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.005116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.073 [2024-07-12 10:58:39.005609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.006878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.008533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.009329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.010598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.011792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.013457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.014689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.015078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.015459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.016974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.018469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.019967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.020353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.021158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.021572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.023194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.023599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.023987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.024372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.024831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.025227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.025622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.026008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.073 [2024-07-12 10:58:39.027815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.028212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.028609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.029004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.029976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.030374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.030774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.031163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.032967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.033363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.033760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.034153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.035119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.035525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.035923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.036312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.038237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.038652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.039038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.039424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.040233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.040264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.040657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.041047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.073 [2024-07-12 10:58:39.041448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.073 [2024-07-12 10:58:39.041840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.043442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.043513] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.043910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.044303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.045260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.045682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.045740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.046133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.046605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.047954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.048015] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.048407] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.048822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.049675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.049736] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.050128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.050529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.050853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.052033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.073 [2024-07-12 10:58:39.052439] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.074 [2024-07-12 10:58:39.052847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:04.074 [2024-07-12 10:58:39.053281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.074 [2024-07-12 10:58:39.053780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.054178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.054582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.054640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.055074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.056574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.056640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.057029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.057415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.058318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.058726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.058777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.059163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.059588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.060581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.060978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.061023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.061420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.061940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.062340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.062390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.062796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.063254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.064247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.074 [2024-07-12 10:58:39.064652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.064697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.065097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.065606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.066003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.066051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.066436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.066851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.067890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.068292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.068337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.068733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.069292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.069701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.069753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.071149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.071561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.072789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.074397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.074464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.075986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.076396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.077043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.077094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.074 [2024-07-12 10:58:39.078592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.078861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.079823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.080220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.080268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.081739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.082150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.082730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.082785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.084057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.084326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.085365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.086449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.086502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.087069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.087478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.088106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.088156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.089713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.090176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.091143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.092426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.092475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.094004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.074 [2024-07-12 10:58:39.094412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.094824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.094875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.096366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.096640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.097640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.098244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.098290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.099929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.100528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.101915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.101964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.102602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.102872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.103860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.105307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.105354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.105747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.074 [2024-07-12 10:58:39.106154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.075 [2024-07-12 10:58:39.107785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.075 [2024-07-12 10:58:39.107835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.075 [2024-07-12 10:58:39.109178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.075 [2024-07-12 10:58:39.109541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.075 [2024-07-12 10:58:39.110422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.075 [2024-07-12 10:58:39.111675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:04.075 [the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats several hundred more times between 10:58:39.111 and 10:58:39.495, console timestamps 00:32:04.075 to 00:32:04.344]
00:32:04.344 [2024-07-12 10:58:39.495410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:04.344 [2024-07-12 10:58:39.495466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.495847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.500266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.500331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.500728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.500777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.501732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.501789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.502818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.502862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.503176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.507986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.508042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.508430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.508492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.509336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.509390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.510740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.510784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.511113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.515305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.515369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.515892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.515938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.345 [2024-07-12 10:58:39.516696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.516760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.517150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.517207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.517554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.521944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.522015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.522409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.522459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.523292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.523347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.524513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.524558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.524839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.527524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.527580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.528177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.528223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.529469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.529531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.530822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.530881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.345 [2024-07-12 10:58:39.531223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.536775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.606 [2024-07-12 10:58:39.536855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.538321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.538366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.539308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.539363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.540971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.541031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.541382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.545601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.545658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.606 [2024-07-12 10:58:39.546446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.546497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.547911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.547968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.548578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.548623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.548973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.552446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.552506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.553612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.553659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.555379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.555436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.556214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.607 [2024-07-12 10:58:39.556266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.556545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.560085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.560141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.561246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.561290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.562679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.562753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.563572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.563618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.563939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.566942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.567000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.568253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.568299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.570184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.570243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.571669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.571726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.571994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.574954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.575010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.576666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.576713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.607 [2024-07-12 10:58:39.578458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.578519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.580049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.580095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.580522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.586345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.586405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.587851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.587897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.589649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.589703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.591045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.591090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.591443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.594960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.595016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.596210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.596255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.597889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.597945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.599243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.599288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.599566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.604407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.607 [2024-07-12 10:58:39.604463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.604510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.605301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.606365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.606420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.607352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.607398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.607678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.611343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.612757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.612803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.612851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.614528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.614582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.616197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.616249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.616574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.620475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.620535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.620575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.620616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.622566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.622620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.623219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.607 [2024-07-12 10:58:39.623264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.623540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.627636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.627688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.627729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.627770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.629936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.629996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.630384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.630430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.630707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.634812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.634862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.607 [2024-07-12 10:58:39.634903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.634944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.636653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.636707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.638220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.638265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.638621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.642657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.642709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.642750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.642791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.608 [2024-07-12 10:58:39.643252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.643298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.643339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.643380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.643724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.647845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.647898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.647943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.647984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.648391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.648437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.648479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.648525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.648865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.653787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.654125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.658259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.608 [2024-07-12 10:58:39.658312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.658361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.658403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.658816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.658863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.658904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.658944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.659294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.663430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.663501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.663543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.663583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.664032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.664078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.664119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.664164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.664464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.668501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.668557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.668597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.668638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.670171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.670232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.670279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.608 [2024-07-12 10:58:39.670319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.670597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.672329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.672384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.672434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.672474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.672928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.672974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.673015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.673056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.673326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.676381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.676440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.676497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.676539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.676944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.676990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.677044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.677086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.677358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.679070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.680514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.680565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.680606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.608 [2024-07-12 10:58:39.681015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.682124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.682171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.682214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.682493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.686332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.687505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.687553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.687594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.688003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.688050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.688097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.688497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.688770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.608 [2024-07-12 10:58:39.692556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.693972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.694023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.694064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.694470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.694528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.696048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.696094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.696419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.701040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.609 [2024-07-12 10:58:39.701096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.702466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.702517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.702926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.704452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.704504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.704546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.704849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.708318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.709114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.709163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.710372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.711968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.712023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.712066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.713084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.713421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.716989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.718273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.718321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.719606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.720018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.720801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.720851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.609 [2024-07-12 10:58:39.721773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.722055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.725141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.726675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.726723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.727697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.728108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.729556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.729608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.731084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.731358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.733654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.734504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.734560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.735825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.736298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.737838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.737886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.738581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.738854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.742792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.744010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.744057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:04.609 [2024-07-12 10:58:39.745042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:04.609 [2024-07-12 10:58:39.745518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:04.609 ... (the same *ERROR* line from accel_dpdk_cryptodev.c:468 repeats several hundred times between 10:58:39.745518 and 10:58:40.174770) ...
00:32:05.153 [2024-07-12 10:58:40.174770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:05.153 [2024-07-12 10:58:40.175043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.153 [2024-07-12 10:58:40.175997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.153 [2024-07-12 10:58:40.177317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.153 [2024-07-12 10:58:40.177369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.153 [2024-07-12 10:58:40.177421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.153 [2024-07-12 10:58:40.177844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.179464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.179519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.179563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.179941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.180990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.182375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.182424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.182473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.182887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.182947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.182994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.184218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.184501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.185542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.154 [2024-07-12 10:58:40.185595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.187108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.187156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.187694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.155 [2024-07-12 10:58:40.187751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.188140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.188185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.188460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.189550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.191077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.191138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.192038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.192492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.155 [2024-07-12 10:58:40.193791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.193840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.193882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.194155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.195313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.196025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.196074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.197347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.199250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.199307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.199349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.200265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.200547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.201589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.203128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.156 [2024-07-12 10:58:40.203186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.203573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.204069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.156 [2024-07-12 10:58:40.205278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.205329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.206591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.206917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.207811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.209370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.209424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.210949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.211371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.212893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.212943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.214370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.214774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.218296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.219601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.219652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.221172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.221730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.223013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.157 [2024-07-12 10:58:40.223062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.224352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.158 [2024-07-12 10:58:40.224633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.227465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.228748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.228796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.230085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.230507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.231051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.231100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.232371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.232652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.233559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.233957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.234004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.234397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.234815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.158 [2024-07-12 10:58:40.236254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.236304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.237782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.238060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.239048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.240370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.240420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.241934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.242345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.159 [2024-07-12 10:58:40.243934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.243990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.244411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.244692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.245870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.247533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.247581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.248187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.248604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.249887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.159 [2024-07-12 10:58:40.249945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.250337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.250732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.251635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.252035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.252097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.253072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.253596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.253995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.254042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.254865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.255213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.256097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.257161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.160 [2024-07-12 10:58:40.257210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.258149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.258567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.259466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.259519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.259903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.260291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.261200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.261612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.261665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.160 [2024-07-12 10:58:40.262948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.263363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.263777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.263829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.264214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.264553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.265448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.266198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.266253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.267424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.267841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.268253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.268299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.268690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.161 [2024-07-12 10:58:40.268968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.269926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.161 [2024-07-12 10:58:40.270357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.270407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.271681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.272093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.273600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.273653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.274038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.274436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.275387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.276547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.276600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.277424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.277853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.279437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.279500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.280992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.281353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.162 [2024-07-12 10:58:40.282391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.283386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.284866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.284931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.285362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.163 [2024-07-12 10:58:40.286284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.286340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.287364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.287814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.289356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.289415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.290734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.291236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.291779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.292194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.292240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.292641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.292990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.294224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.163 [2024-07-12 10:58:40.294642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.295040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.295431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.296029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.296431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.296488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.296879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.297295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.298686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.299088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.164 [2024-07-12 10:58:40.299479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.299875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.300394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.300808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.300862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.301247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.301535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.303971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.304373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.304771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.164 [2024-07-12 10:58:40.305160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.305669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.306070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.306123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.306517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.306973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.309111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.309522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.309928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.310318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.310848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.311253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.311504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.312843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.165 [2024-07-12 10:58:40.313246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.313660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.314051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.314383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.314797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.165 [2024-07-12 10:58:40.315194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.166 [2024-07-12 10:58:40.315594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.166 [2024-07-12 10:58:40.316062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.166 [2024-07-12 10:58:40.317462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.166 [2024-07-12 10:58:40.317894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.166 [2024-07-12 10:58:40.318287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.166 [2024-07-12 10:58:40.318684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.166 [2024-07-12 10:58:40.319554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.319955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.320355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.320763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.321095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.322532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.322939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.323345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.323752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.324600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.167 [2024-07-12 10:58:40.325030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.168 [2024-07-12 10:58:40.325426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.168 [2024-07-12 10:58:40.325829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.168 [2024-07-12 10:58:40.326300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.328083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.328492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.328916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.329337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.330223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.330630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.331022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.331415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.331824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.333111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.333524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.333913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.334685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.336423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.337958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.339258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.340454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.340774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.342745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.343157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.343550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.345147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.430 [2024-07-12 10:58:40.346358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.347943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.348135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.349440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.430 [2024-07-12 10:58:40.349848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.350240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.350633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.351158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.352513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.353793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.355087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.355407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.357614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.357671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.358147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.358540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.360250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.360706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.362069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.362463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.362831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.364185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.364245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.365732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.431 [2024-07-12 10:58:40.366129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.367421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.368361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.369835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.370883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.371242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.372541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.372597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.373272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.374413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.376269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.376328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.376720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.377109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.377391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.379702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.381223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.381273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.381672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.383837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.385414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.387104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.387160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.431 [2024-07-12 10:58:40.387547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.431 [2024-07-12 10:58:40.391521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:05.431 ... (the same *ERROR* from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources, "Failed to get src_mbufs!", repeats for several hundred consecutive log entries between 10:58:40.391 and 10:58:40.685; only the timestamps differ) ...
00:32:05.696 [2024-07-12 10:58:40.685142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:05.696 [2024-07-12 10:58:40.685538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.685931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.687236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.688561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.689683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.690634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.690911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.692167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.693777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.694171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.695849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.696676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.697077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.698687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.699082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.699361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.700809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.702396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.702798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.704443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.705281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.705763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.707112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.708517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.696 [2024-07-12 10:58:40.708882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.710445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.710857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.711252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.712647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.714187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.696 [2024-07-12 10:58:40.714898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.715155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.717259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.717317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.718289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.719585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.719999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.720398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.720813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.722056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.722392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.724365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.724425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.725674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.727189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.728076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.728474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.729969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.697 [2024-07-12 10:58:40.731607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.731888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.734105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.734172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.735545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.737060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.737831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.738246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.739768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.741149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.741463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.744021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.745599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.745659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.747281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.748047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.748103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.748498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.749787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.750125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.752060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.752118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.753795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.753842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.697 [2024-07-12 10:58:40.755626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.757181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.757587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.757636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.758032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.760221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.760279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.761796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.761844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.764034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.765583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.765639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.767217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.767501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.769374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.769432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.770705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.770753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.772683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.772741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.773426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.774923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.775203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.776581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.697 [2024-07-12 10:58:40.776640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.777030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.777081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.777501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.778774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.780034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.780082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.780357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.782463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.782526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.783856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.783904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.784661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.784719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.785109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.785160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.785438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.787854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.787915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.788874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.788922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.790772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.790836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.792402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.697 [2024-07-12 10:58:40.792457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.792738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.794432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.794497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.795767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.795815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.797758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.797814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.798587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.798638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.798911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.801324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.801390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.801787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.801838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.803266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.803324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.804590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.804650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.804973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.807416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.807480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.809129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.809174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.697 [2024-07-12 10:58:40.811208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.811265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.811662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.811718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.812132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.814313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.814372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.815883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.815931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.817925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.817992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.819477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.819532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.819806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.821112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.821171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.822309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.822358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.824054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.824113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.825627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.825674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.826026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.828289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.697 [2024-07-12 10:58:40.828355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.829931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.697 [2024-07-12 10:58:40.829978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.830816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.830877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.832185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.832232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.832564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.834318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.834380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.835830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.835880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.837879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.837942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.839304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.839349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.839757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.841940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.842000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.843284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.843331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.844373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.844431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.845734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.698 [2024-07-12 10:58:40.845782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.846057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.847283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.847342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.847740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.847794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.849408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.849466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.850744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.850791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.851064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.853174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.853232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.853281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.854844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.855607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.855670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.856060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.856111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.856384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.857299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.858809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.858859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.858900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.698 [2024-07-12 10:58:40.860973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.861029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.862582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.862635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.862907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.864247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.864305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.864348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.864391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.866026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.866084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.867371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.867419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.867696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.868641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.868698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.868740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.868782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.870681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.870738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.871829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.871889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.872352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.873640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.698 [2024-07-12 10:58:40.873691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.873737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.873780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.875681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.875738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.876954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.877001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.877275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.878257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.878310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.878353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.878395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.879152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.879208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.879602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.879649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.879924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.880883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.880935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.880978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.881020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.882171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.882229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.883546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.698 [2024-07-12 10:58:40.883595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.883936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.884831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.884900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.884945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.884988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.886082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.698 [2024-07-12 10:58:40.886141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.887450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.887503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.887843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.888731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.888788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.888835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.888876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.889302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.889351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.889393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.889434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.889751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.890610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.890674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.890719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.968 [2024-07-12 10:58:40.890761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.968 [2024-07-12 10:58:40.891249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.891297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.891340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.891383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.891666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.892593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.892644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.892686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.892729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.893136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.893200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.893260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.893308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.893595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.894550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.894601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.894644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.894686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.895101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.895157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.895202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.895245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.895666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.896632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.969 [2024-07-12 10:58:40.896684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.896734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.896782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.897203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.897252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.897308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.897366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.897682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.898553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.898951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.898999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.899043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.900172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.900229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.900272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.900313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.900725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.901713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.902118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.902181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.902226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.902750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.902799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.902843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.969 [2024-07-12 10:58:40.902885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.903229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.904104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.905356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.905405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.905447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.905935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.905984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.906027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.906077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.906348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.907400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.907453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.908420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.908467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.908927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.909677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.909730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.909772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.910052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.911071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.911473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.911531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.911919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.969 [2024-07-12 10:58:40.912339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.912395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.912451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.913939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.914302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.915175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.915579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.915632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.916020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.916433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.916496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.917789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.917836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.918161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.919077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.920696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.920752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.922394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.922906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.923304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.923354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.923398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.923679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.924599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.969 [2024-07-12 10:58:40.925812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.925861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.927296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.928870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.928927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.928968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.929521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.929914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.930958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.932585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.932642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.934070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.934507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.935445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.935501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.937000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.937328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.938879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.940309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.940360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.941820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.942364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.944010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.944056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.969 [2024-07-12 10:58:40.945668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.946035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.947050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.948323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.948373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.949553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.949964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.951324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.969 [2024-07-12 10:58:40.951373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.952765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.953079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.954474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.954934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.954982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.956158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.956588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.957166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.957217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.958285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.958563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.959673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.960536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.960586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.961372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.970 [2024-07-12 10:58:40.961834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.962672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.962725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.963115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.963466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.964523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.964929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.964981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.965370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.965834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.966235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.966301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.966700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.967012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.968064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.968470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.968527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.968915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.969379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.969788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.969857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.970251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.970643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.971767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.970 [2024-07-12 10:58:40.972171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.972228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.972624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.973198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.973608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.973672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.974064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.974535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.975811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.976210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.976260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.976656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.977233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.977641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.977694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.978082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.978444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.979427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.979843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.979903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.980288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.980765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.981165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.981222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.970 [2024-07-12 10:58:40.981622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.982036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.983044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.983451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.983517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.983905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.984401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.984805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.984864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.985253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.985696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.987340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.987754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.988151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.988202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.988823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.989224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.989272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.989672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.990048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.991296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.991363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.991760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.992167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.970 [2024-07-12 10:58:40.992635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.993034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.993088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.993501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.993843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.995234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.995645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.996054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.996464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.997053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.997458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.997515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.997904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.998325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:40.999668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.000083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.000479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.000876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.001385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.001797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.001850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.002239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.002606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.005022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.970 [2024-07-12 10:58:41.006693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.007575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.008837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.009268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.010800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.010850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.011742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.012185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.014258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.015074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.016682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.017801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.018355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.018764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.018817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.019206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.019574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.970 [2024-07-12 10:58:41.021134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.022760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.024244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.025601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.026101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.027400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.027450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.971 [2024-07-12 10:58:41.028745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.029021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.030849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.031760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.032579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.033597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.034129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.034540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.034701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.037079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.037521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.037916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.038305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.038619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.039464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.040473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.041983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.042356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.043860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.044789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.045692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.046520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.047278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.047690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.971 [2024-07-12 10:58:41.048763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.049519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.049840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.051204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.052417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.053263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.054462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.055686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.056872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.057267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.057672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.058063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.060375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.060782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.061172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.062207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.063702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.065184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.066799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.067194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.067593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.070134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.070796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.072295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.971 [2024-07-12 10:58:41.073931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.074692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.075087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.075868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.077146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.077432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.079612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.079671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.081039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.082415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.083329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.083738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.083977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.086228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.086287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.087796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.088420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.088838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.090143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.091658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.092946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.093350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.095475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.095538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.971 [2024-07-12 10:58:41.096816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.098329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.100243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.101632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.102925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.104430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.104831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.107561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.109180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.109234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.110868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.111989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.113271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.114571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.116091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.116367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.118682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.118741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.120042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.120089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.122037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.122095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.122804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.124077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.971 [2024-07-12 10:58:41.124354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.125655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.125717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.126106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.126155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.127780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.129071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.130592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.130644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.130987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.133348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.133430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.134891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.134940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.135853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.137054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.137105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.138358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.138685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.141119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.141185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.142822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.142867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.144915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:05.971 [2024-07-12 10:58:41.144974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.145365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.971 [2024-07-12 10:58:41.145764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.146046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.148582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.148643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.149253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.149300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.149730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.151047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.152608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.152660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:05.972 [2024-07-12 10:58:41.152996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.156429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.156509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.157998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.158049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.159900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.159956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.161210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.161258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.161582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.163227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.163286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:06.240 [2024-07-12 10:58:41.163682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.163737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.165383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.165441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.166740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.166791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.167066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.169285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.169345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.170776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.170831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.171596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.171657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.172050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.172103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.172378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.174890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.174950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.175550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.175600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.177568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.177635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.179151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.240 [2024-07-12 10:58:41.179199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:06.240 [2024-07-12 10:58:41.179473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
    [... the same "Failed to get src_mbufs!" *ERROR* line from accel_dpdk_cryptodev.c:468 repeats continuously, with timestamps running from 10:58:41.179473 through 10:58:41.497408 ...]
00:32:06.505 [2024-07-12 10:58:41.497408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:06.505 [2024-07-12 10:58:41.497455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.499153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.499208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.499615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.500002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.500281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.502751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.502815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.503682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.503728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.505442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.506964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.508037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.508100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.508535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.511323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.511387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.513017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.513065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.514418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.515696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.515745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.517022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.517302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:06.505 [2024-07-12 10:58:41.519019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.519077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.520354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.520401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.522332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.522388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.523653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.524886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.525203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.527024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.527081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.527467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.527517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.527954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.529230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.530526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.505 [2024-07-12 10:58:41.530575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.506 [2024-07-12 10:58:41.530850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.506 [2024-07-12 10:58:41.531824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.506 [2024-07-12 10:58:41.531911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:06.506 [2024-07-12 10:58:41.536737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
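The burst of *ERROR* lines above is expected for this verify job: with a queue depth of 128 and 64 KiB I/Os against the crypto_ram* bdevs, the accel dpdk_cryptodev module periodically exhausts its mbuf pool, accel_dpdk_cryptodev_task_alloc_resources() logs the failure, and the out-of-resources task appears to be retried, so the run still completes. In the summary table that follows, MiB/s is simply IOPS times the 64 KiB I/O size (for example 44.68 IOPS x 64 KiB is about 2.79 MiB/s). The bdevperf command below is a hedged reconstruction of that workload from the parameters printed in the table (queue depth, I/O size, workload); -t 5 is approximate (reported runtimes are 5.6-5.8 s), so treat it as a sketch rather than the exact wrapper invocation.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Reproduce the big-I/O verify workload summarized in the table below:
# 128 in-flight I/Os (-q), 65536-byte I/Os (-o), verify workload (-w), against the generated bdev.json.
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" -q 128 -o 65536 -w verify -t 5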
00:32:06.765 00:32:06.765 Latency(us) 00:32:06.765 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:06.765 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:06.765 Verification LBA range: start 0x0 length 0x100 00:32:06.765 crypto_ram : 5.73 44.68 2.79 0.00 0.00 2766344.01 77959.35 2363399.12 00:32:06.765 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:06.765 Verification LBA range: start 0x100 length 0x100 00:32:06.765 crypto_ram : 5.77 44.34 2.77 0.00 0.00 2804419.01 59723.24 2494699.07 00:32:06.765 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:06.765 Verification LBA range: start 0x0 length 0x100 00:32:06.766 crypto_ram2 : 5.73 44.67 2.79 0.00 0.00 2663783.74 77503.44 2305043.59 00:32:06.766 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:06.766 Verification LBA range: start 0x100 length 0x100 00:32:06.766 crypto_ram2 : 5.77 44.34 2.77 0.00 0.00 2700140.19 59267.34 2494699.07 00:32:06.766 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:06.766 Verification LBA range: start 0x0 length 0x100 00:32:06.766 crypto_ram3 : 5.58 310.05 19.38 0.00 0.00 369976.10 31913.18 561672.01 00:32:06.766 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:06.766 Verification LBA range: start 0x100 length 0x100 00:32:06.766 crypto_ram3 : 5.60 292.88 18.30 0.00 0.00 391235.74 66561.78 576260.90 00:32:06.766 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:06.766 Verification LBA range: start 0x0 length 0x100 00:32:06.766 crypto_ram4 : 5.66 322.50 20.16 0.00 0.00 345870.43 3091.59 496022.04 00:32:06.766 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:06.766 Verification LBA range: start 0x100 length 0x100 00:32:06.766 crypto_ram4 : 5.67 304.10 19.01 0.00 0.00 366208.85 5898.24 393899.85 00:32:06.766 =================================================================================================================== 00:32:06.766 Total : 1407.56 87.97 0.00 0.00 672774.72 3091.59 2494699.07 00:32:07.334 00:32:07.334 real 0m9.006s 00:32:07.334 user 0m17.006s 00:32:07.334 sys 0m0.507s 00:32:07.334 10:58:42 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:07.334 10:58:42 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:32:07.334 ************************************ 00:32:07.334 END TEST bdev_verify_big_io 00:32:07.334 ************************************ 00:32:07.334 10:58:42 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:07.334 10:58:42 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:07.334 10:58:42 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:07.334 10:58:42 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:07.334 10:58:42 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:07.334 ************************************ 00:32:07.334 START TEST bdev_write_zeroes 00:32:07.334 ************************************ 00:32:07.334 10:58:42 blockdev_crypto_aesni.bdev_write_zeroes -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:07.334 [2024-07-12 10:58:42.447016] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:07.334 [2024-07-12 10:58:42.447076] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2203437 ] 00:32:07.594 [2024-07-12 10:58:42.576657] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:07.594 [2024-07-12 10:58:42.677592] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:07.594 [2024-07-12 10:58:42.698877] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:07.594 [2024-07-12 10:58:42.706904] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:07.594 [2024-07-12 10:58:42.714923] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:07.853 [2024-07-12 10:58:42.831581] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:10.386 [2024-07-12 10:58:45.073634] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:10.386 [2024-07-12 10:58:45.073705] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:10.386 [2024-07-12 10:58:45.073720] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.386 [2024-07-12 10:58:45.081654] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:10.386 [2024-07-12 10:58:45.081684] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:10.386 [2024-07-12 10:58:45.081696] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.386 [2024-07-12 10:58:45.089676] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:10.386 [2024-07-12 10:58:45.089696] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:10.386 [2024-07-12 10:58:45.089707] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.386 [2024-07-12 10:58:45.097695] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:10.386 [2024-07-12 10:58:45.097714] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:10.386 [2024-07-12 10:58:45.097725] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.386 Running I/O for 1 seconds... 
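The "Currently unable to find bdev with name: MallocN" notices followed by "vbdev creation deferred pending base bdev arrival" are the normal ordering for this configuration: each crypto vbdev in bdev.json names a DEK key and a Malloc base bdev, and when the create request is processed before its base bdev exists, vbdev_crypto records the request and finishes building the vbdev once the base bdev registers and is examined. The RPC sequence below sketches the same deferral at runtime; the option spellings of accel_crypto_key_create and bdev_crypto_create are assumptions and vary between SPDK releases (check rpc.py <method> -h for this tree), so this is an outline, not the commands the test actually ran.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py"
# Register a DEK, then create the crypto vbdev before its base bdev exists (option names are assumptions):
"$RPC" accel_crypto_key_create -c AES_CBC -k 0123456789abcdef0123456789abcdef -n test_dek_aesni_cbc_1
"$RPC" bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram   # base bdev missing -> creation deferred
"$RPC" bdev_malloc_create -b Malloc0 64 512                            # base bdev arrives -> crypto_ram is created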
00:32:11.321 00:32:11.321 Latency(us) 00:32:11.321 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:11.321 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:11.321 crypto_ram : 1.02 1966.17 7.68 0.00 0.00 64722.93 5442.34 77503.44 00:32:11.321 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:11.321 crypto_ram2 : 1.03 1971.90 7.70 0.00 0.00 64179.41 5413.84 72488.51 00:32:11.321 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:11.321 crypto_ram3 : 1.02 15114.56 59.04 0.00 0.00 8344.58 2478.97 10827.69 00:32:11.321 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:11.321 crypto_ram4 : 1.02 15099.27 58.98 0.00 0.00 8316.61 2478.97 8719.14 00:32:11.321 =================================================================================================================== 00:32:11.321 Total : 34151.90 133.41 0.00 0.00 14829.01 2478.97 77503.44 00:32:11.579 00:32:11.579 real 0m4.254s 00:32:11.579 user 0m3.830s 00:32:11.579 sys 0m0.381s 00:32:11.579 10:58:46 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:11.579 10:58:46 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:32:11.579 ************************************ 00:32:11.579 END TEST bdev_write_zeroes 00:32:11.579 ************************************ 00:32:11.579 10:58:46 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:11.579 10:58:46 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:11.579 10:58:46 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:11.579 10:58:46 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:11.579 10:58:46 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:11.579 ************************************ 00:32:11.579 START TEST bdev_json_nonenclosed 00:32:11.579 ************************************ 00:32:11.580 10:58:46 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:11.839 [2024-07-12 10:58:46.799034] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:11.839 [2024-07-12 10:58:46.799096] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204000 ] 00:32:11.839 [2024-07-12 10:58:46.917949] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:11.839 [2024-07-12 10:58:47.019082] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:11.839 [2024-07-12 10:58:47.019151] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:32:11.839 [2024-07-12 10:58:47.019173] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:11.839 [2024-07-12 10:58:47.019185] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:12.098 00:32:12.098 real 0m0.387s 00:32:12.098 user 0m0.229s 00:32:12.098 sys 0m0.155s 00:32:12.098 10:58:47 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:32:12.098 10:58:47 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:12.098 10:58:47 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:32:12.098 ************************************ 00:32:12.098 END TEST bdev_json_nonenclosed 00:32:12.098 ************************************ 00:32:12.098 10:58:47 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:32:12.098 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:32:12.098 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:12.098 10:58:47 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:12.098 10:58:47 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:12.098 10:58:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:12.098 ************************************ 00:32:12.098 START TEST bdev_json_nonarray 00:32:12.098 ************************************ 00:32:12.098 10:58:47 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:12.098 [2024-07-12 10:58:47.254139] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:12.098 [2024-07-12 10:58:47.254212] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204157 ] 00:32:12.356 [2024-07-12 10:58:47.384203] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:12.356 [2024-07-12 10:58:47.491896] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:12.356 [2024-07-12 10:58:47.491975] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
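Both JSON negative tests hit the same validation in json_config.c: nonenclosed.json is not wrapped in a top-level object and nonarray.json supplies "subsystems" as something other than an array, so bdevperf refuses to start I/O and the wrapper records the non-zero exit status (es=234) as the expected result. A minimal well-formed --json configuration has the shape below; the output path is just an example.

# Minimal valid shape for an SPDK --json config file: a top-level object whose
# "subsystems" member is an array of { "subsystem", "config" } entries.
cat > /tmp/minimal_bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": []
    }
  ]
}
EOF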
00:32:12.356 [2024-07-12 10:58:47.491996] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:12.356 [2024-07-12 10:58:47.492010] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:12.615 00:32:12.615 real 0m0.413s 00:32:12.615 user 0m0.246s 00:32:12.615 sys 0m0.164s 00:32:12.615 10:58:47 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:32:12.615 10:58:47 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:12.615 10:58:47 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:32:12.615 ************************************ 00:32:12.615 END TEST bdev_json_nonarray 00:32:12.615 ************************************ 00:32:12.615 10:58:47 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:32:12.615 10:58:47 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:32:12.615 00:32:12.615 real 1m11.670s 00:32:12.615 user 2m39.552s 00:32:12.615 sys 0m9.015s 00:32:12.615 10:58:47 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:12.615 10:58:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:12.615 ************************************ 00:32:12.615 END TEST blockdev_crypto_aesni 00:32:12.615 ************************************ 00:32:12.615 10:58:47 -- common/autotest_common.sh@1142 -- # return 0 00:32:12.615 10:58:47 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:32:12.615 10:58:47 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:12.615 10:58:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:12.615 10:58:47 -- common/autotest_common.sh@10 -- # set +x 00:32:12.615 ************************************ 00:32:12.615 START TEST blockdev_crypto_sw 00:32:12.615 ************************************ 00:32:12.615 10:58:47 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:32:12.874 * Looking for test storage... 
00:32:12.874 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2204232 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2204232 00:32:12.874 10:58:47 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:12.874 10:58:47 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 2204232 ']' 00:32:12.874 10:58:47 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:12.874 10:58:47 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:12.874 10:58:47 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:32:12.874 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:12.874 10:58:47 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:12.874 10:58:47 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:12.874 [2024-07-12 10:58:47.930635] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:12.874 [2024-07-12 10:58:47.930698] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204232 ] 00:32:12.874 [2024-07-12 10:58:48.044200] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:13.134 [2024-07-12 10:58:48.149951] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:13.702 10:58:48 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:13.702 10:58:48 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:32:13.702 10:58:48 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:13.702 10:58:48 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:32:13.702 10:58:48 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:32:13.702 10:58:48 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:13.702 10:58:48 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:13.961 Malloc0 00:32:13.961 Malloc1 00:32:13.961 true 00:32:13.961 true 00:32:13.961 true 00:32:13.961 [2024-07-12 10:58:49.144248] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:13.961 crypto_ram 00:32:13.961 [2024-07-12 10:58:49.152280] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:14.220 crypto_ram2 00:32:14.220 [2024-07-12 10:58:49.160304] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:14.220 crypto_ram3 00:32:14.220 [ 00:32:14.220 { 00:32:14.220 "name": "Malloc1", 00:32:14.220 "aliases": [ 00:32:14.220 "c4e40d93-d6fb-46a1-9f61-323c20b59e2f" 00:32:14.220 ], 00:32:14.220 "product_name": "Malloc disk", 00:32:14.220 "block_size": 4096, 00:32:14.220 "num_blocks": 4096, 00:32:14.220 "uuid": "c4e40d93-d6fb-46a1-9f61-323c20b59e2f", 00:32:14.220 "assigned_rate_limits": { 00:32:14.220 "rw_ios_per_sec": 0, 00:32:14.220 "rw_mbytes_per_sec": 0, 00:32:14.220 "r_mbytes_per_sec": 0, 00:32:14.220 "w_mbytes_per_sec": 0 00:32:14.220 }, 00:32:14.220 "claimed": true, 00:32:14.220 "claim_type": "exclusive_write", 00:32:14.220 "zoned": false, 00:32:14.220 "supported_io_types": { 00:32:14.220 "read": true, 00:32:14.220 "write": true, 00:32:14.220 "unmap": true, 00:32:14.220 "flush": true, 00:32:14.220 "reset": true, 00:32:14.220 "nvme_admin": false, 00:32:14.220 "nvme_io": false, 00:32:14.220 "nvme_io_md": false, 00:32:14.220 "write_zeroes": true, 00:32:14.220 "zcopy": true, 00:32:14.220 "get_zone_info": false, 00:32:14.220 "zone_management": false, 00:32:14.220 "zone_append": false, 00:32:14.220 "compare": false, 00:32:14.220 "compare_and_write": false, 00:32:14.220 "abort": true, 00:32:14.220 "seek_hole": false, 00:32:14.220 "seek_data": false, 00:32:14.220 "copy": true, 00:32:14.220 "nvme_iov_md": false 00:32:14.220 }, 00:32:14.220 "memory_domains": [ 00:32:14.220 { 00:32:14.220 "dma_device_id": "system", 00:32:14.220 "dma_device_type": 1 00:32:14.220 }, 00:32:14.220 { 
00:32:14.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:14.220 "dma_device_type": 2 00:32:14.220 } 00:32:14.220 ], 00:32:14.220 "driver_specific": {} 00:32:14.220 } 00:32:14.220 ] 00:32:14.220 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:14.220 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:14.220 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:14.220 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:14.220 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:14.220 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:32:14.220 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:14.220 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:14.220 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:14.220 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:14.220 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0e6eac66-2a89-5a2e-8851-acbcde923467"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "0e6eac66-2a89-5a2e-8851-acbcde923467",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "84921983-7b93-5409-8db1-3709dab1869a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "84921983-7b93-5409-8db1-3709dab1869a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:14.221 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 2204232 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 2204232 ']' 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 2204232 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2204232 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2204232' 00:32:14.221 killing process with pid 2204232 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 2204232 00:32:14.221 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 2204232 00:32:14.789 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:14.789 10:58:49 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:14.789 
10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:14.789 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:14.789 10:58:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:14.789 ************************************ 00:32:14.789 START TEST bdev_hello_world 00:32:14.789 ************************************ 00:32:14.789 10:58:49 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:14.789 [2024-07-12 10:58:49.846774] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:14.789 [2024-07-12 10:58:49.846827] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204580 ] 00:32:14.789 [2024-07-12 10:58:49.959794] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:15.049 [2024-07-12 10:58:50.068642] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:15.308 [2024-07-12 10:58:50.243830] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:15.308 [2024-07-12 10:58:50.243895] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:15.308 [2024-07-12 10:58:50.243910] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:15.308 [2024-07-12 10:58:50.251847] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:15.308 [2024-07-12 10:58:50.251866] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:15.308 [2024-07-12 10:58:50.251878] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:15.308 [2024-07-12 10:58:50.259868] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:15.308 [2024-07-12 10:58:50.259886] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:15.308 [2024-07-12 10:58:50.259898] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:15.308 [2024-07-12 10:58:50.300339] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:15.308 [2024-07-12 10:58:50.300380] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:15.308 [2024-07-12 10:58:50.300399] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:15.308 [2024-07-12 10:58:50.302427] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:15.308 [2024-07-12 10:58:50.302519] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:15.308 [2024-07-12 10:58:50.302536] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:15.308 [2024-07-12 10:58:50.302573] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
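The hello_bdev example above opened crypto_ram from the generated bdev.json, wrote the string "Hello World!" through the software-crypto layer, then read it back and printed it, which is the round trip logged here. The same helper accepts any bdev in that config via -b; pointing it at the 4096-byte-block crypto_ram3 vbdev created in this section would look like the sketch below (a usage illustration, not part of the recorded run).

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Repeat the write/read round trip against the second software-crypto vbdev.
"$SPDK/build/examples/hello_bdev" --json "$SPDK/test/bdev/bdev.json" -b crypto_ram3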
00:32:15.308 00:32:15.308 [2024-07-12 10:58:50.302591] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:15.567 00:32:15.567 real 0m0.730s 00:32:15.567 user 0m0.496s 00:32:15.567 sys 0m0.215s 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:15.567 ************************************ 00:32:15.567 END TEST bdev_hello_world 00:32:15.567 ************************************ 00:32:15.567 10:58:50 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:15.567 10:58:50 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:15.567 10:58:50 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:15.567 10:58:50 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:15.567 10:58:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:15.567 ************************************ 00:32:15.567 START TEST bdev_bounds 00:32:15.567 ************************************ 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2204623 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2204623' 00:32:15.567 Process bdevio pid: 2204623 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2204623 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2204623 ']' 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:15.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:15.567 10:58:50 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:15.567 [2024-07-12 10:58:50.670671] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
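bdev_bounds drives the bdevio CUnit suite against the same two vbdevs: bdevio starts with the JSON config and -w so it waits for an RPC trigger, then bdevio/tests.py perform_tests connects and runs the suite, producing the per-bdev pass/fail listing that follows. Run by hand it looks roughly like the sketch below; the wrapper additionally passes -s 0 (the PRE_RESERVED_MEM value) and waits for the RPC socket with waitforlisten instead of a fixed sleep.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Start bdevio waiting for the RPC trigger, then kick off the CUnit tests.
"$SPDK/test/bdev/bdevio/bdevio" -w --json "$SPDK/test/bdev/bdev.json" &
BDEVIO_PID=$!
sleep 2   # crude wait; the harness waits for the RPC socket instead
"$SPDK/test/bdev/bdevio/tests.py" perform_tests
kill "$BDEVIO_PID"; wait "$BDEVIO_PID" || true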
00:32:15.567 [2024-07-12 10:58:50.670742] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204623 ] 00:32:15.826 [2024-07-12 10:58:50.800275] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:15.826 [2024-07-12 10:58:50.909334] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:15.826 [2024-07-12 10:58:50.909419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:15.827 [2024-07-12 10:58:50.909424] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:16.085 [2024-07-12 10:58:51.091939] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:16.085 [2024-07-12 10:58:51.092012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:16.086 [2024-07-12 10:58:51.092029] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:16.086 [2024-07-12 10:58:51.099960] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:16.086 [2024-07-12 10:58:51.099985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:16.086 [2024-07-12 10:58:51.099997] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:16.086 [2024-07-12 10:58:51.107982] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:16.086 [2024-07-12 10:58:51.108002] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:16.086 [2024-07-12 10:58:51.108014] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:16.654 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:16.654 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:16.654 10:58:51 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:16.654 I/O targets: 00:32:16.654 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:32:16.654 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:32:16.654 00:32:16.654 00:32:16.654 CUnit - A unit testing framework for C - Version 2.1-3 00:32:16.654 http://cunit.sourceforge.net/ 00:32:16.654 00:32:16.654 00:32:16.654 Suite: bdevio tests on: crypto_ram3 00:32:16.654 Test: blockdev write read block ...passed 00:32:16.654 Test: blockdev write zeroes read block ...passed 00:32:16.654 Test: blockdev write zeroes read no split ...passed 00:32:16.654 Test: blockdev write zeroes read split ...passed 00:32:16.654 Test: blockdev write zeroes read split partial ...passed 00:32:16.654 Test: blockdev reset ...passed 00:32:16.654 Test: blockdev write read 8 blocks ...passed 00:32:16.654 Test: blockdev write read size > 128k ...passed 00:32:16.654 Test: blockdev write read invalid size ...passed 00:32:16.654 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:16.654 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:16.654 Test: blockdev write read max offset ...passed 00:32:16.654 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:16.654 Test: blockdev writev readv 8 blocks 
...passed 00:32:16.654 Test: blockdev writev readv 30 x 1block ...passed 00:32:16.654 Test: blockdev writev readv block ...passed 00:32:16.654 Test: blockdev writev readv size > 128k ...passed 00:32:16.654 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:16.654 Test: blockdev comparev and writev ...passed 00:32:16.654 Test: blockdev nvme passthru rw ...passed 00:32:16.654 Test: blockdev nvme passthru vendor specific ...passed 00:32:16.654 Test: blockdev nvme admin passthru ...passed 00:32:16.654 Test: blockdev copy ...passed 00:32:16.654 Suite: bdevio tests on: crypto_ram 00:32:16.654 Test: blockdev write read block ...passed 00:32:16.654 Test: blockdev write zeroes read block ...passed 00:32:16.654 Test: blockdev write zeroes read no split ...passed 00:32:16.654 Test: blockdev write zeroes read split ...passed 00:32:16.654 Test: blockdev write zeroes read split partial ...passed 00:32:16.654 Test: blockdev reset ...passed 00:32:16.654 Test: blockdev write read 8 blocks ...passed 00:32:16.654 Test: blockdev write read size > 128k ...passed 00:32:16.654 Test: blockdev write read invalid size ...passed 00:32:16.654 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:16.654 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:16.654 Test: blockdev write read max offset ...passed 00:32:16.654 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:16.654 Test: blockdev writev readv 8 blocks ...passed 00:32:16.654 Test: blockdev writev readv 30 x 1block ...passed 00:32:16.654 Test: blockdev writev readv block ...passed 00:32:16.654 Test: blockdev writev readv size > 128k ...passed 00:32:16.654 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:16.654 Test: blockdev comparev and writev ...passed 00:32:16.654 Test: blockdev nvme passthru rw ...passed 00:32:16.655 Test: blockdev nvme passthru vendor specific ...passed 00:32:16.655 Test: blockdev nvme admin passthru ...passed 00:32:16.655 Test: blockdev copy ...passed 00:32:16.655 00:32:16.655 Run Summary: Type Total Ran Passed Failed Inactive 00:32:16.655 suites 2 2 n/a 0 0 00:32:16.655 tests 46 46 46 0 0 00:32:16.655 asserts 260 260 260 0 n/a 00:32:16.655 00:32:16.655 Elapsed time = 0.084 seconds 00:32:16.655 0 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2204623 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2204623 ']' 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2204623 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2204623 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2204623' 00:32:16.655 killing process with pid 2204623 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2204623 00:32:16.655 10:58:51 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 
-- # wait 2204623 00:32:16.914 10:58:52 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:16.914 00:32:16.914 real 0m1.429s 00:32:16.914 user 0m3.666s 00:32:16.914 sys 0m0.394s 00:32:16.914 10:58:52 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:16.914 10:58:52 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:16.914 ************************************ 00:32:16.914 END TEST bdev_bounds 00:32:16.914 ************************************ 00:32:16.914 10:58:52 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:16.914 10:58:52 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:16.914 10:58:52 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:16.914 10:58:52 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:16.914 10:58:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:17.173 ************************************ 00:32:17.173 START TEST bdev_nbd 00:32:17.173 ************************************ 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2204831 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 
2204831 /var/tmp/spdk-nbd.sock 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2204831 ']' 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:17.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:17.173 10:58:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:17.173 [2024-07-12 10:58:52.177772] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:17.173 [2024-07-12 10:58:52.177841] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:17.173 [2024-07-12 10:58:52.308727] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:17.432 [2024-07-12 10:58:52.408228] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:17.432 [2024-07-12 10:58:52.594228] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:17.432 [2024-07-12 10:58:52.594289] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:17.432 [2024-07-12 10:58:52.594305] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:17.432 [2024-07-12 10:58:52.602247] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:17.432 [2024-07-12 10:58:52.602268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:17.432 [2024-07-12 10:58:52.602279] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:17.432 [2024-07-12 10:58:52.610267] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:17.432 [2024-07-12 10:58:52.610287] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:17.432 [2024-07-12 10:58:52.610299] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:18.376 10:58:53 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.376 1+0 records in 00:32:18.376 1+0 records out 00:32:18.376 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280317 s, 14.6 MB/s 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:18.376 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.944 1+0 records in 00:32:18.944 1+0 records out 00:32:18.944 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000367122 s, 11.2 MB/s 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:18.944 10:58:53 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:18.944 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:18.944 { 00:32:18.944 "nbd_device": "/dev/nbd0", 00:32:18.944 "bdev_name": "crypto_ram" 00:32:18.944 }, 00:32:18.944 { 00:32:18.944 "nbd_device": "/dev/nbd1", 00:32:18.944 "bdev_name": "crypto_ram3" 00:32:18.944 } 00:32:18.944 ]' 00:32:18.944 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:18.944 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:18.944 { 00:32:18.944 "nbd_device": "/dev/nbd0", 00:32:18.944 "bdev_name": "crypto_ram" 00:32:18.944 }, 00:32:18.944 { 00:32:18.944 "nbd_device": "/dev/nbd1", 00:32:18.944 "bdev_name": "crypto_ram3" 00:32:18.944 } 00:32:18.944 ]' 00:32:18.944 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
/dev/nbd1' 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:19.203 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:19.461 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:19.462 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:19.462 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.462 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:19.462 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:19.721 10:58:54 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:19.980 /dev/nbd0 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:19.980 10:58:55 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:19.980 1+0 records in 00:32:19.980 1+0 records out 00:32:19.980 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232603 s, 17.6 MB/s 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:19.980 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:32:20.239 /dev/nbd1 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:20.239 1+0 records in 00:32:20.239 1+0 records out 00:32:20.239 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315751 s, 13.0 MB/s 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:20.239 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:20.240 10:58:55 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:20.240 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:20.240 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:20.499 { 00:32:20.499 "nbd_device": "/dev/nbd0", 00:32:20.499 "bdev_name": "crypto_ram" 00:32:20.499 }, 00:32:20.499 { 00:32:20.499 "nbd_device": "/dev/nbd1", 00:32:20.499 "bdev_name": "crypto_ram3" 00:32:20.499 } 00:32:20.499 ]' 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:20.499 { 00:32:20.499 "nbd_device": "/dev/nbd0", 00:32:20.499 "bdev_name": "crypto_ram" 00:32:20.499 }, 00:32:20.499 { 00:32:20.499 "nbd_device": "/dev/nbd1", 00:32:20.499 "bdev_name": "crypto_ram3" 00:32:20.499 } 00:32:20.499 ]' 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:20.499 /dev/nbd1' 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:20.499 /dev/nbd1' 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:32:20.499 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:20.758 256+0 records in 00:32:20.758 256+0 records out 00:32:20.758 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107209 s, 97.8 MB/s 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:20.758 256+0 records in 00:32:20.758 256+0 records out 00:32:20.758 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206551 s, 50.8 MB/s 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:20.758 256+0 records in 00:32:20.758 256+0 records out 00:32:20.758 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0422688 s, 24.8 MB/s 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:20.758 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:21.017 10:58:55 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:21.017 10:58:55 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:21.277 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:21.537 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:21.537 malloc_lvol_verify 00:32:21.796 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:21.796 
10e06fbc-e3ca-4d4b-937b-f5de98e46069 00:32:21.796 10:58:56 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:22.055 588b85f5-01d8-47ef-b4b5-9ce7e6e71b4f 00:32:22.055 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:22.314 /dev/nbd0 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:22.314 mke2fs 1.46.5 (30-Dec-2021) 00:32:22.314 Discarding device blocks: 0/4096 done 00:32:22.314 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:22.314 00:32:22.314 Allocating group tables: 0/1 done 00:32:22.314 Writing inode tables: 0/1 done 00:32:22.314 Creating journal (1024 blocks): done 00:32:22.314 Writing superblocks and filesystem accounting information: 0/1 done 00:32:22.314 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:22.314 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2204831 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2204831 ']' 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2204831 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2204831 
00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2204831' 00:32:22.574 killing process with pid 2204831 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2204831 00:32:22.574 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2204831 00:32:22.833 10:58:57 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:22.833 00:32:22.833 real 0m5.852s 00:32:22.833 user 0m8.283s 00:32:22.833 sys 0m2.375s 00:32:22.833 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:22.833 10:58:57 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:22.833 ************************************ 00:32:22.833 END TEST bdev_nbd 00:32:22.833 ************************************ 00:32:22.833 10:58:58 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:22.833 10:58:58 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:22.833 10:58:58 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:32:22.833 10:58:58 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:32:22.833 10:58:58 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:22.833 10:58:58 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:22.833 10:58:58 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:22.833 10:58:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:23.092 ************************************ 00:32:23.092 START TEST bdev_fio 00:32:23.092 ************************************ 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:23.092 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:23.092 10:58:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:23.092 ************************************ 00:32:23.093 START TEST bdev_fio_rw_verify 00:32:23.093 ************************************ 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:23.093 10:58:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:23.352 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:23.352 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:23.352 fio-3.35 00:32:23.352 Starting 2 threads 00:32:35.563 00:32:35.563 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2205936: Fri Jul 12 10:59:09 2024 00:32:35.563 read: IOPS=14.8k, BW=57.9MiB/s (60.7MB/s)(579MiB/10001msec) 00:32:35.563 slat (usec): min=16, max=595, avg=29.57, stdev=11.04 00:32:35.563 clat (usec): min=18, max=1022, avg=214.16, stdev=93.57 00:32:35.563 lat (usec): min=40, max=1262, avg=243.74, stdev=99.46 00:32:35.563 clat percentiles (usec): 00:32:35.563 | 50.000th=[ 198], 99.000th=[ 486], 99.900th=[ 562], 99.990th=[ 619], 00:32:35.563 | 99.999th=[ 1020] 00:32:35.563 write: IOPS=17.8k, BW=69.6MiB/s (72.9MB/s)(659MiB/9477msec); 0 zone resets 00:32:35.563 slat (usec): min=18, max=1999, avg=49.84, stdev=17.36 00:32:35.563 clat (usec): min=37, max=2374, avg=286.79, stdev=134.90 00:32:35.563 lat (usec): min=68, max=2406, avg=336.63, stdev=142.74 00:32:35.563 clat percentiles (usec): 00:32:35.563 | 50.000th=[ 269], 99.000th=[ 635], 99.900th=[ 676], 99.990th=[ 783], 00:32:35.563 | 99.999th=[ 2343] 00:32:35.563 bw ( KiB/s): min=62208, max=72824, per=94.81%, avg=67544.84, stdev=1584.08, samples=38 00:32:35.563 iops : min=15552, max=18206, avg=16886.21, stdev=396.02, samples=38 00:32:35.563 lat (usec) : 20=0.01%, 50=0.01%, 100=3.95%, 250=53.79%, 500=36.97% 00:32:35.563 lat (usec) : 750=5.27%, 1000=0.01% 00:32:35.563 lat (msec) : 2=0.01%, 4=0.01% 00:32:35.563 cpu : usr=99.44%, sys=0.01%, ctx=31, majf=0, minf=450 00:32:35.563 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:35.563 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:35.563 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:35.563 issued rwts: total=148189,168785,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:35.563 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:35.563 00:32:35.563 Run status group 0 (all jobs): 00:32:35.563 READ: bw=57.9MiB/s (60.7MB/s), 57.9MiB/s-57.9MiB/s (60.7MB/s-60.7MB/s), io=579MiB (607MB), run=10001-10001msec 00:32:35.563 WRITE: bw=69.6MiB/s (72.9MB/s), 69.6MiB/s-69.6MiB/s (72.9MB/s-72.9MB/s), io=659MiB (691MB), run=9477-9477msec 00:32:35.563 00:32:35.563 real 0m11.057s 00:32:35.563 user 0m23.306s 00:32:35.563 sys 0m0.316s 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:35.563 ************************************ 00:32:35.563 END TEST bdev_fio_rw_verify 00:32:35.563 ************************************ 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:35.563 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0e6eac66-2a89-5a2e-8851-acbcde923467"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "0e6eac66-2a89-5a2e-8851-acbcde923467",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "84921983-7b93-5409-8db1-3709dab1869a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "84921983-7b93-5409-8db1-3709dab1869a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": 
false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:35.564 crypto_ram3 ]] 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0e6eac66-2a89-5a2e-8851-acbcde923467"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "0e6eac66-2a89-5a2e-8851-acbcde923467",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "84921983-7b93-5409-8db1-3709dab1869a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "84921983-7b93-5409-8db1-3709dab1869a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:35.564 ************************************ 00:32:35.564 START TEST bdev_fio_trim 00:32:35.564 ************************************ 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:35.564 10:59:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:35.564 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:35.564 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:35.564 fio-3.35 00:32:35.564 Starting 2 threads 00:32:45.618 00:32:45.618 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2207444: Fri Jul 12 10:59:20 2024 00:32:45.618 write: IOPS=34.9k, BW=136MiB/s (143MB/s)(1362MiB/10001msec); 0 zone resets 00:32:45.618 slat (usec): min=14, max=564, avg=25.04, stdev= 7.77 00:32:45.618 clat (usec): min=37, max=2630, avg=187.56, stdev=103.83 00:32:45.618 lat (usec): min=52, max=2658, avg=212.60, stdev=108.44 00:32:45.618 clat percentiles (usec): 00:32:45.618 | 50.000th=[ 159], 99.000th=[ 412], 99.900th=[ 461], 99.990th=[ 873], 00:32:45.618 | 99.999th=[ 2573] 00:32:45.618 bw ( KiB/s): min=94592, max=157304, per=100.00%, avg=141818.11, stdev=13862.10, samples=38 00:32:45.618 iops : min=23648, max=39326, avg=35454.63, stdev=3465.39, samples=38 00:32:45.618 trim: IOPS=34.9k, BW=136MiB/s (143MB/s)(1362MiB/10001msec); 0 
zone resets 00:32:45.618 slat (usec): min=6, max=224, avg=12.14, stdev= 4.30 00:32:45.618 clat (usec): min=51, max=2438, avg=124.40, stdev=52.15 00:32:45.618 lat (usec): min=59, max=2454, avg=136.54, stdev=54.24 00:32:45.618 clat percentiles (usec): 00:32:45.618 | 50.000th=[ 120], 99.000th=[ 322], 99.900th=[ 371], 99.990th=[ 424], 00:32:45.618 | 99.999th=[ 742] 00:32:45.618 bw ( KiB/s): min=94592, max=157304, per=100.00%, avg=141819.79, stdev=13861.40, samples=38 00:32:45.618 iops : min=23648, max=39326, avg=35455.05, stdev=3465.36, samples=38 00:32:45.618 lat (usec) : 50=3.11%, 100=28.19%, 250=49.89%, 500=18.77%, 750=0.02% 00:32:45.618 lat (usec) : 1000=0.01% 00:32:45.618 lat (msec) : 2=0.01%, 4=0.01% 00:32:45.618 cpu : usr=99.56%, sys=0.01%, ctx=39, majf=0, minf=302 00:32:45.618 IO depths : 1=7.2%, 2=17.0%, 4=60.6%, 8=15.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:45.618 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:45.618 complete : 0=0.0%, 4=86.8%, 8=13.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:45.618 issued rwts: total=0,348562,348562,0 short=0,0,0,0 dropped=0,0,0,0 00:32:45.618 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:45.618 00:32:45.618 Run status group 0 (all jobs): 00:32:45.618 WRITE: bw=136MiB/s (143MB/s), 136MiB/s-136MiB/s (143MB/s-143MB/s), io=1362MiB (1428MB), run=10001-10001msec 00:32:45.618 TRIM: bw=136MiB/s (143MB/s), 136MiB/s-136MiB/s (143MB/s-143MB/s), io=1362MiB (1428MB), run=10001-10001msec 00:32:45.618 00:32:45.618 real 0m11.150s 00:32:45.618 user 0m23.800s 00:32:45.618 sys 0m0.385s 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:45.618 ************************************ 00:32:45.618 END TEST bdev_fio_trim 00:32:45.618 ************************************ 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:45.618 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:45.618 00:32:45.618 real 0m22.524s 00:32:45.618 user 0m47.271s 00:32:45.618 sys 0m0.870s 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:45.618 ************************************ 00:32:45.618 END TEST bdev_fio 00:32:45.618 ************************************ 00:32:45.618 10:59:20 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:45.618 10:59:20 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:45.618 10:59:20 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:45.618 10:59:20 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:45.618 10:59:20 
blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:45.618 10:59:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:45.618 ************************************ 00:32:45.618 START TEST bdev_verify 00:32:45.618 ************************************ 00:32:45.618 10:59:20 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:45.618 [2024-07-12 10:59:20.686443] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:45.618 [2024-07-12 10:59:20.686515] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208856 ] 00:32:45.618 [2024-07-12 10:59:20.803550] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:45.878 [2024-07-12 10:59:20.902330] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:45.878 [2024-07-12 10:59:20.902336] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:46.137 [2024-07-12 10:59:21.073283] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:46.137 [2024-07-12 10:59:21.073350] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:46.137 [2024-07-12 10:59:21.073364] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:46.137 [2024-07-12 10:59:21.081300] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:46.137 [2024-07-12 10:59:21.081320] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:46.137 [2024-07-12 10:59:21.081331] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:46.137 [2024-07-12 10:59:21.089322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:46.137 [2024-07-12 10:59:21.089341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:46.137 [2024-07-12 10:59:21.089352] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:46.137 Running I/O for 5 seconds... 
00:32:51.410 00:32:51.410 Latency(us) 00:32:51.410 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:51.410 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:51.410 Verification LBA range: start 0x0 length 0x800 00:32:51.410 crypto_ram : 5.02 5457.12 21.32 0.00 0.00 23355.53 1617.03 30773.43 00:32:51.410 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:51.410 Verification LBA range: start 0x800 length 0x800 00:32:51.410 crypto_ram : 5.02 5459.57 21.33 0.00 0.00 23347.78 1866.35 30773.43 00:32:51.410 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:51.410 Verification LBA range: start 0x0 length 0x800 00:32:51.410 crypto_ram3 : 5.02 2727.27 10.65 0.00 0.00 46638.25 7408.42 35332.45 00:32:51.410 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:51.410 Verification LBA range: start 0x800 length 0x800 00:32:51.410 crypto_ram3 : 5.03 2747.78 10.73 0.00 0.00 46285.85 1923.34 35788.35 00:32:51.410 =================================================================================================================== 00:32:51.410 Total : 16391.74 64.03 0.00 0.00 31078.80 1617.03 35788.35 00:32:51.410 00:32:51.410 real 0m5.798s 00:32:51.410 user 0m10.927s 00:32:51.410 sys 0m0.227s 00:32:51.410 10:59:26 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:51.410 10:59:26 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:32:51.410 ************************************ 00:32:51.410 END TEST bdev_verify 00:32:51.410 ************************************ 00:32:51.410 10:59:26 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:51.410 10:59:26 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:51.410 10:59:26 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:51.410 10:59:26 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:51.410 10:59:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:51.410 ************************************ 00:32:51.410 START TEST bdev_verify_big_io 00:32:51.410 ************************************ 00:32:51.410 10:59:26 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:51.410 [2024-07-12 10:59:26.559413] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
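The verify pass above exercises two software-crypto stacks: crypto_ram is a crypto vbdev keyed "test_dek_sw" on top of Malloc0, and crypto_ram3 is keyed "test_dek_sw3" on top of crypto_ram2, itself a crypto vbdev over Malloc1 (see the "Found key" notices). A rough sketch of building one such layer by hand over the RPC socket follows; the cipher, key material and flag spellings are assumptions rather than values taken from this run, so check scripts/rpc.py <command> -h before relying on them:

# Assumed example values throughout; only the RPC names and the layering
# (register a named key, then wrap a base bdev) mirror the trace above.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK_DIR/scripts/rpc.py"

$RPC bdev_malloc_create -b Malloc0 16 512                    # 16 MiB base bdev, 512B blocks
$RPC accel_crypto_key_create -c AES_CBC \
    -k 0123456789abcdef0123456789abcdef -n test_dek_sw       # placeholder hex key
$RPC bdev_crypto_create Malloc0 crypto_ram -n test_dek_sw    # crypto vbdev bound to the named key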
00:32:51.410 [2024-07-12 10:59:26.559472] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2209576 ] 00:32:51.669 [2024-07-12 10:59:26.685999] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:51.669 [2024-07-12 10:59:26.784347] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:51.669 [2024-07-12 10:59:26.784353] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:51.929 [2024-07-12 10:59:26.948757] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:51.929 [2024-07-12 10:59:26.948821] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:51.929 [2024-07-12 10:59:26.948836] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:51.929 [2024-07-12 10:59:26.956776] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:51.929 [2024-07-12 10:59:26.956796] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:51.929 [2024-07-12 10:59:26.956807] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:51.929 [2024-07-12 10:59:26.964801] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:51.929 [2024-07-12 10:59:26.964820] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:51.929 [2024-07-12 10:59:26.964832] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:51.929 Running I/O for 5 seconds... 
00:32:57.202 00:32:57.202 Latency(us) 00:32:57.202 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:57.202 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:57.202 Verification LBA range: start 0x0 length 0x80 00:32:57.202 crypto_ram : 5.17 421.18 26.32 0.00 0.00 296417.01 6382.64 403017.91 00:32:57.202 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:57.202 Verification LBA range: start 0x80 length 0x80 00:32:57.202 crypto_ram : 5.15 422.64 26.42 0.00 0.00 295528.66 6097.70 401194.30 00:32:57.202 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:57.202 Verification LBA range: start 0x0 length 0x80 00:32:57.202 crypto_ram3 : 5.35 239.22 14.95 0.00 0.00 501155.46 5784.26 415783.18 00:32:57.202 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:57.202 Verification LBA range: start 0x80 length 0x80 00:32:57.202 crypto_ram3 : 5.33 239.95 15.00 0.00 0.00 499830.36 5641.79 413959.57 00:32:57.202 =================================================================================================================== 00:32:57.202 Total : 1323.00 82.69 0.00 0.00 371721.01 5641.79 415783.18 00:32:57.461 00:32:57.461 real 0m6.114s 00:32:57.461 user 0m11.539s 00:32:57.461 sys 0m0.233s 00:32:57.461 10:59:32 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:57.461 10:59:32 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:32:57.461 ************************************ 00:32:57.461 END TEST bdev_verify_big_io 00:32:57.461 ************************************ 00:32:57.461 10:59:32 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:57.461 10:59:32 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:57.461 10:59:32 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:57.720 10:59:32 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:57.720 10:59:32 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:57.720 ************************************ 00:32:57.720 START TEST bdev_write_zeroes 00:32:57.720 ************************************ 00:32:57.720 10:59:32 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:57.720 [2024-07-12 10:59:32.749729] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:32:57.720 [2024-07-12 10:59:32.749789] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210453 ] 00:32:57.720 [2024-07-12 10:59:32.876031] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:57.987 [2024-07-12 10:59:32.977396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:57.987 [2024-07-12 10:59:33.155313] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:57.987 [2024-07-12 10:59:33.155385] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:57.987 [2024-07-12 10:59:33.155399] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:57.987 [2024-07-12 10:59:33.163332] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:57.987 [2024-07-12 10:59:33.163352] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:57.987 [2024-07-12 10:59:33.163364] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:57.987 [2024-07-12 10:59:33.171353] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:57.987 [2024-07-12 10:59:33.171372] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:57.987 [2024-07-12 10:59:33.171383] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:58.249 Running I/O for 1 seconds... 00:32:59.184 00:32:59.184 Latency(us) 00:32:59.184 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:59.184 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:59.184 crypto_ram : 1.01 26467.63 103.39 0.00 0.00 4823.96 1289.35 6610.59 00:32:59.184 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:59.184 crypto_ram3 : 1.01 13278.82 51.87 0.00 0.00 9564.26 1645.52 9915.88 00:32:59.184 =================================================================================================================== 00:32:59.184 Total : 39746.45 155.26 0.00 0.00 6414.16 1289.35 9915.88 00:32:59.443 00:32:59.443 real 0m1.760s 00:32:59.443 user 0m1.507s 00:32:59.443 sys 0m0.232s 00:32:59.443 10:59:34 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:59.443 10:59:34 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:32:59.443 ************************************ 00:32:59.443 END TEST bdev_write_zeroes 00:32:59.443 ************************************ 00:32:59.443 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:59.443 10:59:34 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.443 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:59.443 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:59.443 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:59.443 
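The three data-path checks above (verify, verify_big_io and write_zeroes) all drive the same bdevperf binary against the generated bdev.json; only the workload, I/O size and runtime change. The shared invocation pattern, with the flags copied from those traces, looks like this:

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

# -q queue depth, -o I/O size in bytes, -w workload, -t runtime in seconds,
# -m reactor core mask; -C is passed for the two verify runs as shown above.
"$SPDK_DIR/build/examples/bdevperf" \
    --json "$SPDK_DIR/test/bdev/bdev.json" \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3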
************************************ 00:32:59.443 START TEST bdev_json_nonenclosed 00:32:59.443 ************************************ 00:32:59.443 10:59:34 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.443 [2024-07-12 10:59:34.579621] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:32:59.443 [2024-07-12 10:59:34.579680] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210655 ] 00:32:59.702 [2024-07-12 10:59:34.707904] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:59.702 [2024-07-12 10:59:34.804372] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:59.702 [2024-07-12 10:59:34.804440] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:32:59.702 [2024-07-12 10:59:34.804461] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:59.702 [2024-07-12 10:59:34.804474] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:59.961 00:32:59.961 real 0m0.385s 00:32:59.961 user 0m0.240s 00:32:59.961 sys 0m0.143s 00:32:59.961 10:59:34 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:32:59.961 10:59:34 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:59.961 10:59:34 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:32:59.961 ************************************ 00:32:59.961 END TEST bdev_json_nonenclosed 00:32:59.961 ************************************ 00:32:59.961 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:32:59.961 10:59:34 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:32:59.961 10:59:34 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.961 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:59.961 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:59.961 10:59:34 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:59.961 ************************************ 00:32:59.961 START TEST bdev_json_nonarray 00:32:59.961 ************************************ 00:32:59.961 10:59:34 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.961 [2024-07-12 10:59:35.038310] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
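Both JSON negative tests hand bdevperf a deliberately malformed configuration: the case above is rejected because the file is not enclosed in {}, and the case starting here expects "subsystems" to be an array. For reference, a minimal well-formed config has the shape sketched below (contents assumed; the real bdev.json used by the other tests carries the Malloc and crypto entries dumped earlier):

# Minimal well-formed SPDK JSON config: a single top-level object whose
# "subsystems" member is an array of {subsystem, config[]} entries.
cat > /tmp/minimal.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 32768, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF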
00:32:59.961 [2024-07-12 10:59:35.038374] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210677 ] 00:33:00.219 [2024-07-12 10:59:35.170411] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:00.219 [2024-07-12 10:59:35.271958] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:00.219 [2024-07-12 10:59:35.272037] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:33:00.219 [2024-07-12 10:59:35.272058] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:00.219 [2024-07-12 10:59:35.272070] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:00.219 00:33:00.219 real 0m0.403s 00:33:00.219 user 0m0.239s 00:33:00.219 sys 0m0.161s 00:33:00.219 10:59:35 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:00.219 10:59:35 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:00.219 10:59:35 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:00.219 ************************************ 00:33:00.219 END TEST bdev_json_nonarray 00:33:00.219 ************************************ 00:33:00.478 10:59:35 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:00.478 10:59:35 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:33:00.478 10:59:35 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:33:00.478 10:59:35 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:33:00.478 10:59:35 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:33:00.478 10:59:35 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:33:00.478 10:59:35 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:33:00.478 10:59:35 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:00.478 10:59:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:00.478 ************************************ 00:33:00.478 START TEST bdev_crypto_enomem 00:33:00.478 ************************************ 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=2210864 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 2210864 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 2210864 ']' 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- 
bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:00.478 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:00.478 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:00.478 [2024-07-12 10:59:35.503507] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:33:00.478 [2024-07-12 10:59:35.503573] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210864 ] 00:33:00.478 [2024-07-12 10:59:35.622921] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:00.737 [2024-07-12 10:59:35.734539] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:00.995 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:00.995 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:33:00.995 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:33:00.995 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.995 10:59:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:00.995 true 00:33:00.995 base0 00:33:00.995 true 00:33:00.995 [2024-07-12 10:59:36.002707] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:00.995 crypt0 00:33:00.995 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.995 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:33:00.995 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:33:00.995 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:00.995 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:33:00.995 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.996 10:59:36 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:00.996 [ 00:33:00.996 { 00:33:00.996 "name": "crypt0", 00:33:00.996 "aliases": [ 00:33:00.996 "cbb2b81c-a453-5637-a593-60b8ffe3e9d3" 00:33:00.996 ], 00:33:00.996 "product_name": "crypto", 00:33:00.996 "block_size": 512, 00:33:00.996 "num_blocks": 2097152, 00:33:00.996 "uuid": "cbb2b81c-a453-5637-a593-60b8ffe3e9d3", 00:33:00.996 "assigned_rate_limits": { 00:33:00.996 "rw_ios_per_sec": 0, 00:33:00.996 "rw_mbytes_per_sec": 0, 00:33:00.996 "r_mbytes_per_sec": 0, 00:33:00.996 "w_mbytes_per_sec": 0 00:33:00.996 }, 00:33:00.996 "claimed": false, 00:33:00.996 "zoned": false, 00:33:00.996 "supported_io_types": { 00:33:00.996 "read": true, 00:33:00.996 "write": true, 00:33:00.996 "unmap": false, 00:33:00.996 "flush": false, 00:33:00.996 "reset": true, 00:33:00.996 "nvme_admin": false, 00:33:00.996 "nvme_io": false, 00:33:00.996 "nvme_io_md": false, 00:33:00.996 "write_zeroes": true, 00:33:00.996 "zcopy": false, 00:33:00.996 "get_zone_info": false, 00:33:00.996 "zone_management": false, 00:33:00.996 "zone_append": false, 00:33:00.996 "compare": false, 00:33:00.996 "compare_and_write": false, 00:33:00.996 "abort": false, 00:33:00.996 "seek_hole": false, 00:33:00.996 "seek_data": false, 00:33:00.996 "copy": false, 00:33:00.996 "nvme_iov_md": false 00:33:00.996 }, 00:33:00.996 "memory_domains": [ 00:33:00.996 { 00:33:00.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:00.996 "dma_device_type": 2 00:33:00.996 } 00:33:00.996 ], 00:33:00.996 "driver_specific": { 00:33:00.996 "crypto": { 00:33:00.996 "base_bdev_name": "EE_base0", 00:33:00.996 "name": "crypt0", 00:33:00.996 "key_name": "test_dek_sw" 00:33:00.996 } 00:33:00.996 } 00:33:00.996 } 00:33:00.996 ] 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=2210871 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:33:00.996 10:59:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:00.996 Running I/O for 5 seconds... 
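The enomem case above stacks crypt0 (key "test_dek_sw") on EE_base0, an error-injection wrapper around base0, inside a bdevperf started with -z so it waits for RPC. The run is then driven by two RPC steps, which can be reproduced roughly as below; the harness's rpc_cmd wraps scripts/rpc.py against the default /var/tmp/spdk.sock, and the injection arguments are copied verbatim from the trace that follows:

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

# Start the configured randwrite job in the background, as the harness does.
"$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" perform_tests &

# While it runs, queue five 'nomem' failures for writes on the error bdev.
"$SPDK_DIR/scripts/rpc.py" bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem

wait    # collect the final bdevperf numbers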
00:33:01.930 10:59:37 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:33:01.930 10:59:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:01.930 10:59:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:01.930 10:59:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:01.930 10:59:37 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 2210871 00:33:06.115 00:33:06.115 Latency(us) 00:33:06.115 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:06.115 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:33:06.115 crypt0 : 5.00 36108.27 141.05 0.00 0.00 882.43 418.50 1909.09 00:33:06.115 =================================================================================================================== 00:33:06.115 Total : 36108.27 141.05 0.00 0.00 882.43 418.50 1909.09 00:33:06.115 0 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 2210864 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 2210864 ']' 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 2210864 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2210864 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2210864' 00:33:06.115 killing process with pid 2210864 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 2210864 00:33:06.115 Received shutdown signal, test time was about 5.000000 seconds 00:33:06.115 00:33:06.115 Latency(us) 00:33:06.115 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:06.115 =================================================================================================================== 00:33:06.115 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:06.115 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 2210864 00:33:06.407 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:33:06.407 00:33:06.407 real 0m5.958s 00:33:06.407 user 0m6.244s 00:33:06.407 sys 0m0.363s 00:33:06.407 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:33:06.407 10:59:41 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:06.407 ************************************ 00:33:06.407 END TEST bdev_crypto_enomem 00:33:06.407 ************************************ 00:33:06.407 10:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:33:06.407 10:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:33:06.407 00:33:06.407 real 0m53.709s 00:33:06.407 user 1m32.777s 00:33:06.407 sys 0m6.355s 00:33:06.407 10:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:06.407 10:59:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.407 ************************************ 00:33:06.407 END TEST blockdev_crypto_sw 00:33:06.407 ************************************ 00:33:06.407 10:59:41 -- common/autotest_common.sh@1142 -- # return 0 00:33:06.407 10:59:41 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:06.407 10:59:41 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:06.407 10:59:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:06.407 10:59:41 -- common/autotest_common.sh@10 -- # set +x 00:33:06.407 ************************************ 00:33:06.407 START TEST blockdev_crypto_qat 00:33:06.407 ************************************ 00:33:06.407 10:59:41 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:06.665 * Looking for test storage... 
00:33:06.665 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:06.665 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2211637 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:06.666 10:59:41 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2211637 00:33:06.666 10:59:41 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 2211637 ']' 00:33:06.666 10:59:41 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:06.666 10:59:41 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:06.666 10:59:41 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:33:06.666 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:06.666 10:59:41 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:06.666 10:59:41 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:06.666 [2024-07-12 10:59:41.716950] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:33:06.666 [2024-07-12 10:59:41.717032] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2211637 ] 00:33:06.666 [2024-07-12 10:59:41.843073] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:06.924 [2024-07-12 10:59:41.949886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:07.489 10:59:42 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:07.489 10:59:42 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:33:07.489 10:59:42 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:07.489 10:59:42 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:33:07.489 10:59:42 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:33:07.489 10:59:42 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:07.489 10:59:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:07.489 [2024-07-12 10:59:42.644085] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:07.489 [2024-07-12 10:59:42.652120] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:07.489 [2024-07-12 10:59:42.660138] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:07.746 [2024-07-12 10:59:42.734901] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:10.275 true 00:33:10.275 true 00:33:10.275 true 00:33:10.275 true 00:33:10.275 Malloc0 00:33:10.275 Malloc1 00:33:10.275 Malloc2 00:33:10.275 Malloc3 00:33:10.275 [2024-07-12 10:59:45.129270] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:10.275 crypto_ram 00:33:10.275 [2024-07-12 10:59:45.137289] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:10.275 crypto_ram1 00:33:10.275 [2024-07-12 10:59:45.145310] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:10.275 crypto_ram2 00:33:10.275 [2024-07-12 10:59:45.153331] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:10.275 crypto_ram3 00:33:10.275 [ 00:33:10.275 { 00:33:10.275 "name": "Malloc1", 00:33:10.275 "aliases": [ 00:33:10.275 "be98a2f1-e5d3-4db4-b144-85bdaaffe532" 00:33:10.275 ], 00:33:10.275 "product_name": "Malloc disk", 00:33:10.275 "block_size": 512, 00:33:10.275 "num_blocks": 65536, 00:33:10.275 "uuid": "be98a2f1-e5d3-4db4-b144-85bdaaffe532", 00:33:10.275 "assigned_rate_limits": { 00:33:10.275 "rw_ios_per_sec": 0, 00:33:10.275 "rw_mbytes_per_sec": 0, 00:33:10.275 "r_mbytes_per_sec": 0, 00:33:10.275 "w_mbytes_per_sec": 0 00:33:10.275 }, 00:33:10.275 "claimed": true, 00:33:10.275 "claim_type": "exclusive_write", 00:33:10.275 "zoned": false, 00:33:10.275 "supported_io_types": { 
00:33:10.275 "read": true, 00:33:10.275 "write": true, 00:33:10.275 "unmap": true, 00:33:10.275 "flush": true, 00:33:10.275 "reset": true, 00:33:10.275 "nvme_admin": false, 00:33:10.275 "nvme_io": false, 00:33:10.275 "nvme_io_md": false, 00:33:10.275 "write_zeroes": true, 00:33:10.275 "zcopy": true, 00:33:10.275 "get_zone_info": false, 00:33:10.275 "zone_management": false, 00:33:10.275 "zone_append": false, 00:33:10.275 "compare": false, 00:33:10.275 "compare_and_write": false, 00:33:10.275 "abort": true, 00:33:10.275 "seek_hole": false, 00:33:10.275 "seek_data": false, 00:33:10.275 "copy": true, 00:33:10.275 "nvme_iov_md": false 00:33:10.275 }, 00:33:10.275 "memory_domains": [ 00:33:10.275 { 00:33:10.275 "dma_device_id": "system", 00:33:10.275 "dma_device_type": 1 00:33:10.275 }, 00:33:10.275 { 00:33:10.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:10.275 "dma_device_type": 2 00:33:10.275 } 00:33:10.275 ], 00:33:10.275 "driver_specific": {} 00:33:10.275 } 00:33:10.275 ] 00:33:10.275 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.275 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:10.275 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.275 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:10.275 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.275 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:33:10.275 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:10.275 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.275 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:10.275 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.275 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' 
"aliases": [' ' "c5c51610-60da-5096-9571-599dcfe9e013"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c5c51610-60da-5096-9571-599dcfe9e013",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "bdc3e136-4c6b-5988-8463-bf6bb42caf9f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bdc3e136-4c6b-5988-8463-bf6bb42caf9f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "dee479e7-6a77-5b96-a65a-49fbd9f6f2b1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dee479e7-6a77-5b96-a65a-49fbd9f6f2b1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c4ff5833-39c7-50d0-8c5d-e197e2ab8276"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c4ff5833-39c7-50d0-8c5d-e197e2ab8276",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:10.276 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 2211637 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 2211637 ']' 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 2211637 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2211637 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2211637' 00:33:10.276 killing process with pid 2211637 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 2211637 00:33:10.276 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 2211637 00:33:10.842 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:10.842 10:59:45 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:10.842 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:10.842 10:59:45 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:10.842 10:59:45 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:33:10.842 ************************************ 00:33:10.842 START TEST bdev_hello_world 00:33:10.842 ************************************ 00:33:10.842 10:59:45 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:10.842 [2024-07-12 10:59:46.025758] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:33:10.842 [2024-07-12 10:59:46.025819] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2212183 ] 00:33:11.100 [2024-07-12 10:59:46.154089] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:11.100 [2024-07-12 10:59:46.250992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:11.100 [2024-07-12 10:59:46.272266] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:11.100 [2024-07-12 10:59:46.280294] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:11.100 [2024-07-12 10:59:46.288319] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:11.357 [2024-07-12 10:59:46.397890] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:13.887 [2024-07-12 10:59:48.602868] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:13.887 [2024-07-12 10:59:48.602938] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:13.887 [2024-07-12 10:59:48.602954] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.887 [2024-07-12 10:59:48.610886] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:13.887 [2024-07-12 10:59:48.610906] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:13.887 [2024-07-12 10:59:48.610919] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.887 [2024-07-12 10:59:48.618906] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:13.887 [2024-07-12 10:59:48.618925] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:13.887 [2024-07-12 10:59:48.618937] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.887 [2024-07-12 10:59:48.626926] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:13.887 [2024-07-12 10:59:48.626944] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:13.887 [2024-07-12 10:59:48.626955] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.887 [2024-07-12 10:59:48.699780] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:13.887 [2024-07-12 10:59:48.699823] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:13.887 [2024-07-12 10:59:48.699843] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:33:13.887 [2024-07-12 10:59:48.701105] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:13.887 [2024-07-12 10:59:48.701179] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:13.887 [2024-07-12 10:59:48.701197] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:13.887 [2024-07-12 10:59:48.701240] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:13.887 00:33:13.887 [2024-07-12 10:59:48.701260] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:14.146 00:33:14.146 real 0m3.106s 00:33:14.146 user 0m2.712s 00:33:14.146 sys 0m0.354s 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:14.146 ************************************ 00:33:14.146 END TEST bdev_hello_world 00:33:14.146 ************************************ 00:33:14.146 10:59:49 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:14.146 10:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:14.146 10:59:49 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:14.146 10:59:49 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:14.146 10:59:49 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:14.146 ************************************ 00:33:14.146 START TEST bdev_bounds 00:33:14.146 ************************************ 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2212670 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2212670' 00:33:14.146 Process bdevio pid: 2212670 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2212670 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2212670 ']' 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:14.146 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:14.146 10:59:49 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:14.146 [2024-07-12 10:59:49.222728] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
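The "Found key test_dek_qat_*" and "vbdev creation deferred pending base bdev arrival" notices that appear before each test in this log come from the crypto vbdev setup in the test JSON config: each crypto_ram* bdev is stacked on a Malloc* base bdev and bound to one of the DEKs test_dek_qat_cbc, test_dek_qat_xts, test_dek_qat_cbc2 or test_dek_qat_xts2. A minimal sketch of the equivalent RPC sequence for one such bdev is shown below; it assumes the standard SPDK rpc.py client and a placeholder hex key, and exact flag names can differ between SPDK releases, so treat it as illustrative rather than the exact commands the harness runs:

  # create the base bdev: 32 MiB with 512-byte blocks (matches crypto_ram above)
  ./scripts/rpc.py bdev_malloc_create -b Malloc0 32 512
  # register an AES-CBC DEK with the accel framework (key value here is a placeholder)
  ./scripts/rpc.py accel_crypto_key_create -c AES_CBC -n test_dek_qat_cbc -k 00112233445566778899aabbccddeeff
  # stack the crypto vbdev on the base bdev, encrypting with that key
  ./scripts/rpc.py bdev_crypto_create -n test_dek_qat_cbc Malloc0 crypto_ram

If the crypto bdev is defined before its Malloc base exists, the "creation deferred pending base bdev arrival" path seen above is taken and the vbdev is registered once the base bdev shows up.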
00:33:14.146 [2024-07-12 10:59:49.222791] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2212670 ] 00:33:14.405 [2024-07-12 10:59:49.353104] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:14.405 [2024-07-12 10:59:49.461268] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:14.405 [2024-07-12 10:59:49.461353] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:14.405 [2024-07-12 10:59:49.461357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:14.405 [2024-07-12 10:59:49.482707] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:14.405 [2024-07-12 10:59:49.490731] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:14.405 [2024-07-12 10:59:49.498749] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:14.665 [2024-07-12 10:59:49.601630] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:17.202 [2024-07-12 10:59:51.802772] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:17.202 [2024-07-12 10:59:51.802851] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:17.202 [2024-07-12 10:59:51.802865] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:17.202 [2024-07-12 10:59:51.810788] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:17.202 [2024-07-12 10:59:51.810809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:17.202 [2024-07-12 10:59:51.810822] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:17.202 [2024-07-12 10:59:51.818813] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:17.202 [2024-07-12 10:59:51.818833] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:17.202 [2024-07-12 10:59:51.818844] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:17.202 [2024-07-12 10:59:51.826834] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:17.202 [2024-07-12 10:59:51.826852] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:17.202 [2024-07-12 10:59:51.826864] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:17.202 10:59:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:17.202 10:59:51 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:17.202 10:59:51 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:17.202 I/O targets: 00:33:17.202 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:33:17.202 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:33:17.202 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:33:17.202 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:33:17.202 
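The I/O targets summary above is the bdevio view of the same four crypto bdevs dumped earlier: two geometries, one capacity. Both work out to 32 MiB, which can be double-checked against a running target with the sketch below (it assumes rpc.py can reach the target on its default RPC socket):

  # 65536 blocks * 512  B = 33554432 B = 32 MiB   (crypto_ram,  crypto_ram1)
  #  8192 blocks * 4096 B = 33554432 B = 32 MiB   (crypto_ram2, crypto_ram3)
  ./scripts/rpc.py bdev_get_bdevs -b crypto_ram3 | jq '.[0] | .block_size * .num_blocks'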
00:33:17.202 00:33:17.202 CUnit - A unit testing framework for C - Version 2.1-3 00:33:17.202 http://cunit.sourceforge.net/ 00:33:17.202 00:33:17.202 00:33:17.202 Suite: bdevio tests on: crypto_ram3 00:33:17.202 Test: blockdev write read block ...passed 00:33:17.202 Test: blockdev write zeroes read block ...passed 00:33:17.202 Test: blockdev write zeroes read no split ...passed 00:33:17.202 Test: blockdev write zeroes read split ...passed 00:33:17.202 Test: blockdev write zeroes read split partial ...passed 00:33:17.202 Test: blockdev reset ...passed 00:33:17.202 Test: blockdev write read 8 blocks ...passed 00:33:17.202 Test: blockdev write read size > 128k ...passed 00:33:17.202 Test: blockdev write read invalid size ...passed 00:33:17.202 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:17.202 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:17.202 Test: blockdev write read max offset ...passed 00:33:17.202 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:17.202 Test: blockdev writev readv 8 blocks ...passed 00:33:17.202 Test: blockdev writev readv 30 x 1block ...passed 00:33:17.202 Test: blockdev writev readv block ...passed 00:33:17.202 Test: blockdev writev readv size > 128k ...passed 00:33:17.202 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:17.202 Test: blockdev comparev and writev ...passed 00:33:17.202 Test: blockdev nvme passthru rw ...passed 00:33:17.202 Test: blockdev nvme passthru vendor specific ...passed 00:33:17.202 Test: blockdev nvme admin passthru ...passed 00:33:17.202 Test: blockdev copy ...passed 00:33:17.202 Suite: bdevio tests on: crypto_ram2 00:33:17.202 Test: blockdev write read block ...passed 00:33:17.202 Test: blockdev write zeroes read block ...passed 00:33:17.202 Test: blockdev write zeroes read no split ...passed 00:33:17.202 Test: blockdev write zeroes read split ...passed 00:33:17.202 Test: blockdev write zeroes read split partial ...passed 00:33:17.202 Test: blockdev reset ...passed 00:33:17.202 Test: blockdev write read 8 blocks ...passed 00:33:17.202 Test: blockdev write read size > 128k ...passed 00:33:17.202 Test: blockdev write read invalid size ...passed 00:33:17.202 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:17.202 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:17.202 Test: blockdev write read max offset ...passed 00:33:17.202 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:17.202 Test: blockdev writev readv 8 blocks ...passed 00:33:17.202 Test: blockdev writev readv 30 x 1block ...passed 00:33:17.202 Test: blockdev writev readv block ...passed 00:33:17.202 Test: blockdev writev readv size > 128k ...passed 00:33:17.202 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:17.202 Test: blockdev comparev and writev ...passed 00:33:17.202 Test: blockdev nvme passthru rw ...passed 00:33:17.202 Test: blockdev nvme passthru vendor specific ...passed 00:33:17.202 Test: blockdev nvme admin passthru ...passed 00:33:17.202 Test: blockdev copy ...passed 00:33:17.202 Suite: bdevio tests on: crypto_ram1 00:33:17.202 Test: blockdev write read block ...passed 00:33:17.202 Test: blockdev write zeroes read block ...passed 00:33:17.202 Test: blockdev write zeroes read no split ...passed 00:33:17.202 Test: blockdev write zeroes read split ...passed 00:33:17.202 Test: blockdev write zeroes read split partial ...passed 00:33:17.202 Test: blockdev reset 
...passed 00:33:17.202 Test: blockdev write read 8 blocks ...passed 00:33:17.202 Test: blockdev write read size > 128k ...passed 00:33:17.202 Test: blockdev write read invalid size ...passed 00:33:17.202 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:17.202 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:17.202 Test: blockdev write read max offset ...passed 00:33:17.202 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:17.202 Test: blockdev writev readv 8 blocks ...passed 00:33:17.202 Test: blockdev writev readv 30 x 1block ...passed 00:33:17.203 Test: blockdev writev readv block ...passed 00:33:17.203 Test: blockdev writev readv size > 128k ...passed 00:33:17.203 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:17.203 Test: blockdev comparev and writev ...passed 00:33:17.203 Test: blockdev nvme passthru rw ...passed 00:33:17.203 Test: blockdev nvme passthru vendor specific ...passed 00:33:17.203 Test: blockdev nvme admin passthru ...passed 00:33:17.203 Test: blockdev copy ...passed 00:33:17.203 Suite: bdevio tests on: crypto_ram 00:33:17.203 Test: blockdev write read block ...passed 00:33:17.203 Test: blockdev write zeroes read block ...passed 00:33:17.203 Test: blockdev write zeroes read no split ...passed 00:33:17.203 Test: blockdev write zeroes read split ...passed 00:33:17.203 Test: blockdev write zeroes read split partial ...passed 00:33:17.203 Test: blockdev reset ...passed 00:33:17.203 Test: blockdev write read 8 blocks ...passed 00:33:17.203 Test: blockdev write read size > 128k ...passed 00:33:17.203 Test: blockdev write read invalid size ...passed 00:33:17.203 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:17.203 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:17.203 Test: blockdev write read max offset ...passed 00:33:17.203 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:17.203 Test: blockdev writev readv 8 blocks ...passed 00:33:17.203 Test: blockdev writev readv 30 x 1block ...passed 00:33:17.203 Test: blockdev writev readv block ...passed 00:33:17.203 Test: blockdev writev readv size > 128k ...passed 00:33:17.203 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:17.203 Test: blockdev comparev and writev ...passed 00:33:17.203 Test: blockdev nvme passthru rw ...passed 00:33:17.203 Test: blockdev nvme passthru vendor specific ...passed 00:33:17.203 Test: blockdev nvme admin passthru ...passed 00:33:17.203 Test: blockdev copy ...passed 00:33:17.203 00:33:17.203 Run Summary: Type Total Ran Passed Failed Inactive 00:33:17.203 suites 4 4 n/a 0 0 00:33:17.203 tests 92 92 92 0 0 00:33:17.203 asserts 520 520 520 0 n/a 00:33:17.203 00:33:17.203 Elapsed time = 0.517 seconds 00:33:17.203 0 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2212670 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2212670 ']' 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2212670 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2212670 00:33:17.203 10:59:52 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2212670' 00:33:17.203 killing process with pid 2212670 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2212670 00:33:17.203 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2212670 00:33:17.770 10:59:52 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:17.770 00:33:17.770 real 0m3.578s 00:33:17.771 user 0m9.915s 00:33:17.771 sys 0m0.570s 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:17.771 ************************************ 00:33:17.771 END TEST bdev_bounds 00:33:17.771 ************************************ 00:33:17.771 10:59:52 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:17.771 10:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:17.771 10:59:52 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:17.771 10:59:52 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:17.771 10:59:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:17.771 ************************************ 00:33:17.771 START TEST bdev_nbd 00:33:17.771 ************************************ 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2213103 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2213103 /var/tmp/spdk-nbd.sock 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2213103 ']' 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:17.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:17.771 10:59:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:17.771 [2024-07-12 10:59:52.875165] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
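The nbd test starting here exports each crypto bdev as a kernel /dev/nbd* device through the bdev_svc app listening on /var/tmp/spdk-nbd.sock, reads a single 4 KiB block back with dd to prove the mapping works, then detaches the devices. A condensed sketch of that flow for one bdev, assuming the nbd kernel module is already loaded (the harness only checks for /sys/module/nbd) and using an arbitrary scratch file, looks like this:

  # map the crypto bdev to an nbd device via the bdev_svc RPC socket
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
  # read one block straight from the kernel block device to verify the export
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
  # list current bdev<->nbd mappings, then detach
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0

In the log that follows, the first pass (nbd_start_disks_without_nbd_idx) omits the device argument and lets the target pick the node, while the second pass passes /dev/nbd0, /dev/nbd1, /dev/nbd10 and /dev/nbd11 explicitly, which is why those names appear in the nbd_get_disks JSON further down.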
00:33:17.771 [2024-07-12 10:59:52.875225] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:18.028 [2024-07-12 10:59:53.005914] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:18.028 [2024-07-12 10:59:53.108934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:18.028 [2024-07-12 10:59:53.130285] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:18.028 [2024-07-12 10:59:53.138305] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:18.028 [2024-07-12 10:59:53.146323] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:18.328 [2024-07-12 10:59:53.247141] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:20.859 [2024-07-12 10:59:55.448291] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:20.859 [2024-07-12 10:59:55.448357] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:20.859 [2024-07-12 10:59:55.448371] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:20.859 [2024-07-12 10:59:55.456311] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:20.859 [2024-07-12 10:59:55.456332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:20.859 [2024-07-12 10:59:55.456344] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:20.859 [2024-07-12 10:59:55.464332] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:20.859 [2024-07-12 10:59:55.464350] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:20.859 [2024-07-12 10:59:55.464362] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:20.859 [2024-07-12 10:59:55.472352] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:20.859 [2024-07-12 10:59:55.472370] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:20.859 [2024-07-12 10:59:55.472381] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:20.859 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:20.860 1+0 records in 00:33:20.860 1+0 records out 00:33:20.860 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308793 s, 13.3 MB/s 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:20.860 10:59:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:21.119 1+0 records in 00:33:21.119 1+0 records out 00:33:21.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311846 s, 13.1 MB/s 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:21.119 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:21.379 10:59:56 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:21.379 1+0 records in 00:33:21.379 1+0 records out 00:33:21.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000318534 s, 12.9 MB/s 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:21.379 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:21.638 1+0 records in 00:33:21.638 1+0 records out 00:33:21.638 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000332323 s, 12.3 MB/s 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:21.638 10:59:56 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:21.638 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:21.897 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd0", 00:33:21.897 "bdev_name": "crypto_ram" 00:33:21.897 }, 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd1", 00:33:21.897 "bdev_name": "crypto_ram1" 00:33:21.897 }, 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd2", 00:33:21.897 "bdev_name": "crypto_ram2" 00:33:21.897 }, 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd3", 00:33:21.897 "bdev_name": "crypto_ram3" 00:33:21.897 } 00:33:21.897 ]' 00:33:21.897 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:21.897 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:21.897 10:59:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd0", 00:33:21.897 "bdev_name": "crypto_ram" 00:33:21.897 }, 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd1", 00:33:21.897 "bdev_name": "crypto_ram1" 00:33:21.897 }, 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd2", 00:33:21.897 "bdev_name": "crypto_ram2" 00:33:21.897 }, 00:33:21.897 { 00:33:21.897 "nbd_device": "/dev/nbd3", 00:33:21.897 "bdev_name": "crypto_ram3" 00:33:21.897 } 00:33:21.897 ]' 00:33:21.897 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:33:21.897 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:21.898 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:33:21.898 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:21.898 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:21.898 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:21.898 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:22.157 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:22.416 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:33:22.675 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:33:22.675 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:33:22.675 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:33:22.675 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:22.675 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:22.676 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:33:22.676 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:22.676 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:22.676 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:22.676 10:59:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:22.935 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:23.194 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:23.453 /dev/nbd0 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:23.453 10:59:58 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:23.453 1+0 records in 00:33:23.453 1+0 records out 00:33:23.453 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297484 s, 13.8 MB/s 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:23.453 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:33:23.712 /dev/nbd1 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:23.712 1+0 records in 00:33:23.712 1+0 records out 00:33:23.712 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334385 s, 12.2 MB/s 00:33:23.712 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.713 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:23.713 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.713 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:23.713 10:59:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:23.713 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:23.713 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:23.713 10:59:58 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:33:23.972 /dev/nbd10 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:23.972 1+0 records in 00:33:23.972 1+0 records out 00:33:23.972 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00033093 s, 12.4 MB/s 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:23.972 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:33:24.230 /dev/nbd11 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:24.230 1+0 records in 00:33:24.230 1+0 records out 00:33:24.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00036592 s, 11.2 MB/s 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:24.230 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd0", 00:33:24.566 "bdev_name": "crypto_ram" 00:33:24.566 }, 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd1", 00:33:24.566 "bdev_name": "crypto_ram1" 00:33:24.566 }, 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd10", 00:33:24.566 "bdev_name": "crypto_ram2" 00:33:24.566 }, 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd11", 00:33:24.566 "bdev_name": "crypto_ram3" 00:33:24.566 } 00:33:24.566 ]' 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd0", 00:33:24.566 "bdev_name": "crypto_ram" 00:33:24.566 }, 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd1", 00:33:24.566 "bdev_name": "crypto_ram1" 00:33:24.566 }, 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd10", 00:33:24.566 "bdev_name": "crypto_ram2" 00:33:24.566 }, 00:33:24.566 { 00:33:24.566 "nbd_device": "/dev/nbd11", 00:33:24.566 "bdev_name": "crypto_ram3" 00:33:24.566 } 00:33:24.566 ]' 00:33:24.566 10:59:59 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:24.566 /dev/nbd1 00:33:24.566 /dev/nbd10 00:33:24.566 /dev/nbd11' 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:24.566 /dev/nbd1 00:33:24.566 /dev/nbd10 00:33:24.566 /dev/nbd11' 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:24.566 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:24.566 256+0 records in 00:33:24.566 256+0 records out 00:33:24.566 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010484 s, 100 MB/s 00:33:24.567 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:24.567 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:24.825 256+0 records in 00:33:24.825 256+0 records out 00:33:24.825 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0833807 s, 12.6 MB/s 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:24.825 256+0 records in 00:33:24.825 256+0 records out 00:33:24.825 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0581365 s, 18.0 MB/s 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:33:24.825 256+0 records in 00:33:24.825 256+0 records out 00:33:24.825 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0392536 s, 26.7 MB/s 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:33:24.825 256+0 records in 00:33:24.825 256+0 records out 00:33:24.825 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0554519 s, 18.9 MB/s 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:24.825 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:24.826 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:24.826 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:24.826 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:24.826 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:24.826 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:24.826 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:33:24.826 10:59:59 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:24.826 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:25.084 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:25.084 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:25.084 11:00:00 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:25.084 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:25.084 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:25.084 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:25.342 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:25.342 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:25.342 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:25.342 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:25.342 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:33:25.600 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:25.858 11:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:26.117 11:00:01 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:26.117 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:26.375 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:26.632 malloc_lvol_verify 00:33:26.632 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:26.892 c3f2c213-587b-4060-aa06-520129793c91 00:33:26.892 11:00:01 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:27.149 53c977dd-163f-4687-b3d3-6ed32eb1f740 00:33:27.149 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:27.149 /dev/nbd0 00:33:27.149 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
00:33:27.409 mke2fs 1.46.5 (30-Dec-2021) 00:33:27.409 Discarding device blocks: 0/4096 done 00:33:27.409 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:27.409 00:33:27.409 Allocating group tables: 0/1 done 00:33:27.409 Writing inode tables: 0/1 done 00:33:27.409 Creating journal (1024 blocks): done 00:33:27.409 Writing superblocks and filesystem accounting information: 0/1 done 00:33:27.409 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:27.409 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2213103 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2213103 ']' 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2213103 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2213103 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2213103' 00:33:27.668 killing process with pid 2213103 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2213103 00:33:27.668 11:00:02 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 
2213103 00:33:27.926 11:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:27.926 00:33:27.926 real 0m10.218s 00:33:27.926 user 0m13.364s 00:33:27.926 sys 0m4.012s 00:33:27.926 11:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:27.926 11:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:27.926 ************************************ 00:33:27.926 END TEST bdev_nbd 00:33:27.926 ************************************ 00:33:27.926 11:00:03 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:27.926 11:00:03 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:27.926 11:00:03 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:33:27.926 11:00:03 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:33:27.927 11:00:03 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:27.927 11:00:03 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:27.927 11:00:03 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:27.927 11:00:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:27.927 ************************************ 00:33:27.927 START TEST bdev_fio 00:33:27.927 ************************************ 00:33:27.927 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:27.927 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:27.927 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:27.927 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:27.927 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:27.927 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:27.927 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:28.187 ************************************ 00:33:28.187 START TEST bdev_fio_rw_verify 00:33:28.187 ************************************ 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:28.187 11:00:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.446 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.446 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.446 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.446 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.446 fio-3.35 00:33:28.446 Starting 4 threads 00:33:43.344 00:33:43.344 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2215625: Fri Jul 12 11:00:16 2024 00:33:43.344 read: IOPS=18.5k, BW=72.3MiB/s (75.8MB/s)(723MiB/10001msec) 00:33:43.344 slat (usec): min=17, max=483, avg=72.00, stdev=31.65 00:33:43.344 clat (usec): min=29, max=1448, avg=403.06, stdev=226.75 00:33:43.344 lat (usec): min=72, max=1502, avg=475.06, stdev=239.57 00:33:43.344 clat percentiles (usec): 00:33:43.344 | 50.000th=[ 355], 99.000th=[ 996], 99.900th=[ 1156], 99.990th=[ 1287], 00:33:43.344 | 99.999th=[ 1418] 00:33:43.344 write: IOPS=20.4k, BW=79.7MiB/s (83.6MB/s)(778MiB/9754msec); 0 zone resets 00:33:43.344 slat (usec): min=21, max=1582, avg=86.38, stdev=30.30 00:33:43.344 clat (usec): min=31, max=2418, avg=452.62, stdev=240.19 00:33:43.344 lat (usec): min=78, max=2503, avg=538.99, stdev=251.80 00:33:43.344 clat percentiles (usec): 00:33:43.344 | 50.000th=[ 412], 99.000th=[ 1037], 99.900th=[ 1172], 99.990th=[ 1647], 00:33:43.344 | 99.999th=[ 2278] 00:33:43.344 bw ( KiB/s): min=63272, max=106216, per=97.73%, avg=79788.21, stdev=3028.43, samples=76 00:33:43.344 iops : min=15818, max=26554, avg=19947.05, stdev=757.11, samples=76 00:33:43.344 lat (usec) : 50=0.01%, 100=1.43%, 250=26.30%, 500=38.08%, 750=22.59% 00:33:43.344 lat (usec) : 1000=10.28% 00:33:43.344 lat (msec) : 2=1.31%, 4=0.01% 00:33:43.344 cpu : usr=99.52%, sys=0.01%, ctx=75, majf=0, minf=233 00:33:43.344 IO depths : 1=6.1%, 2=26.9%, 4=53.7%, 8=13.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:43.344 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:43.344 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:43.344 issued rwts: total=185038,199080,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:43.344 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:43.344 00:33:43.344 Run status group 0 (all jobs): 00:33:43.344 READ: bw=72.3MiB/s (75.8MB/s), 72.3MiB/s-72.3MiB/s (75.8MB/s-75.8MB/s), io=723MiB (758MB), run=10001-10001msec 00:33:43.344 WRITE: bw=79.7MiB/s (83.6MB/s), 79.7MiB/s-79.7MiB/s (83.6MB/s-83.6MB/s), io=778MiB (815MB), run=9754-9754msec 00:33:43.344 00:33:43.344 real 0m13.486s 00:33:43.344 user 0m45.728s 00:33:43.344 sys 0m0.519s 00:33:43.344 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:43.344 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:43.344 ************************************ 00:33:43.344 END TEST bdev_fio_rw_verify 00:33:43.344 ************************************ 00:33:43.344 11:00:16 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:33:43.344 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:43.344 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:43.344 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:43.344 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c5c51610-60da-5096-9571-599dcfe9e013"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c5c51610-60da-5096-9571-599dcfe9e013",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram1",' ' "aliases": [' ' "bdc3e136-4c6b-5988-8463-bf6bb42caf9f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bdc3e136-4c6b-5988-8463-bf6bb42caf9f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "dee479e7-6a77-5b96-a65a-49fbd9f6f2b1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dee479e7-6a77-5b96-a65a-49fbd9f6f2b1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c4ff5833-39c7-50d0-8c5d-e197e2ab8276"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c4ff5833-39c7-50d0-8c5d-e197e2ab8276",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:43.345 crypto_ram1 00:33:43.345 crypto_ram2 00:33:43.345 crypto_ram3 ]] 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c5c51610-60da-5096-9571-599dcfe9e013"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c5c51610-60da-5096-9571-599dcfe9e013",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "bdc3e136-4c6b-5988-8463-bf6bb42caf9f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bdc3e136-4c6b-5988-8463-bf6bb42caf9f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "dee479e7-6a77-5b96-a65a-49fbd9f6f2b1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dee479e7-6a77-5b96-a65a-49fbd9f6f2b1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c4ff5833-39c7-50d0-8c5d-e197e2ab8276"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c4ff5833-39c7-50d0-8c5d-e197e2ab8276",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:43.345 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:43.346 
11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:43.346 ************************************ 00:33:43.346 START TEST bdev_fio_trim 00:33:43.346 ************************************ 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:43.346 11:00:16 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:43.346 11:00:17 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:43.346 11:00:17 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:43.346 11:00:17 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:43.346 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:43.346 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:43.346 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:43.346 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:43.346 fio-3.35 00:33:43.346 Starting 4 threads 00:33:55.553 00:33:55.553 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2217487: Fri Jul 12 11:00:30 2024 00:33:55.553 write: IOPS=34.0k, BW=133MiB/s (139MB/s)(1329MiB/10001msec); 0 zone resets 00:33:55.553 slat (usec): min=19, max=409, avg=66.15, stdev=24.20 00:33:55.553 clat (usec): min=45, max=1498, avg=244.33, stdev=108.56 00:33:55.553 lat (usec): min=75, max=1569, avg=310.48, stdev=117.99 00:33:55.553 clat percentiles (usec): 00:33:55.553 | 50.000th=[ 237], 99.000th=[ 494], 99.900th=[ 693], 99.990th=[ 791], 00:33:55.553 | 99.999th=[ 1254] 00:33:55.553 bw ( KiB/s): min=121216, max=176608, per=100.00%, avg=136196.84, stdev=3403.33, samples=76 00:33:55.553 iops : min=30304, max=44152, avg=34049.05, stdev=850.84, samples=76 00:33:55.553 trim: IOPS=34.0k, BW=133MiB/s (139MB/s)(1329MiB/10001msec); 0 zone resets 00:33:55.554 slat (usec): min=6, max=350, avg=20.55, stdev= 8.51 00:33:55.554 clat (usec): min=75, max=1569, avg=310.68, stdev=118.01 00:33:55.554 lat (usec): min=84, max=1588, avg=331.22, stdev=119.74 00:33:55.554 clat percentiles (usec): 00:33:55.554 | 50.000th=[ 306], 99.000th=[ 578], 99.900th=[ 840], 99.990th=[ 971], 00:33:55.554 | 99.999th=[ 1385] 00:33:55.554 bw ( KiB/s): min=121216, max=176608, per=100.00%, avg=136196.84, stdev=3403.33, samples=76 00:33:55.554 iops : min=30304, max=44152, avg=34049.05, stdev=850.84, samples=76 00:33:55.554 lat (usec) : 50=0.05%, 100=3.86%, 250=40.87%, 500=51.86%, 750=3.28% 00:33:55.554 lat (usec) : 1000=0.08% 
00:33:55.554 lat (msec) : 2=0.01%
00:33:55.554 cpu : usr=99.59%, sys=0.00%, ctx=57, majf=0, minf=102
00:33:55.554 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0%
00:33:55.554 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:33:55.554 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:33:55.554 issued rwts: total=0,340100,340101,0 short=0,0,0,0 dropped=0,0,0,0
00:33:55.554 latency : target=0, window=0, percentile=100.00%, depth=8
00:33:55.554
00:33:55.554 Run status group 0 (all jobs):
00:33:55.554 WRITE: bw=133MiB/s (139MB/s), 133MiB/s-133MiB/s (139MB/s-139MB/s), io=1329MiB (1393MB), run=10001-10001msec
00:33:55.554 TRIM: bw=133MiB/s (139MB/s), 133MiB/s-133MiB/s (139MB/s-139MB/s), io=1329MiB (1393MB), run=10001-10001msec
00:33:55.554
00:33:55.554 real 0m13.480s
00:33:55.554 user 0m45.693s
00:33:55.554 sys 0m0.474s
00:33:55.554 11:00:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:55.554 11:00:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:33:55.554 ************************************
00:33:55.554 END TEST bdev_fio_trim
00:33:55.554 ************************************
00:33:55.554 11:00:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:33:55.554 11:00:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:33:55.554 11:00:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:33:55.554 11:00:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:33:55.554 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:33:55.555 11:00:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:33:55.555
00:33:55.555 real 0m27.333s
00:33:55.555 user 1m31.606s
00:33:55.555 sys 0m1.195s
00:33:55.555 11:00:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:55.555 11:00:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:33:55.555 ************************************
00:33:55.555 END TEST bdev_fio
00:33:55.555 ************************************
00:33:55.555 11:00:30 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:33:55.555 11:00:30 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:33:55.555 11:00:30 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:33:55.555 11:00:30 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:33:55.555 11:00:30 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:55.555 11:00:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:33:55.555 ************************************
00:33:55.555 START TEST bdev_verify
00:33:55.555 ************************************
00:33:55.555 11:00:30 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:33:55.555 [2024-07-12 11:00:30.588586] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization...
00:33:55.555 [2024-07-12 11:00:30.588638] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2218861 ]
00:33:55.555 [2024-07-12 11:00:30.702173] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:33:55.818 [2024-07-12 11:00:30.811439] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:33:55.818 [2024-07-12 11:00:30.811443] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:55.818 [2024-07-12 11:00:30.832834] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:33:55.818 [2024-07-12 11:00:30.840866] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:55.818 [2024-07-12 11:00:30.848891] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:55.818 [2024-07-12 11:00:30.956691] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:33:58.356 [2024-07-12 11:00:33.175252] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:33:58.357 [2024-07-12 11:00:33.175337] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:33:58.357 [2024-07-12 11:00:33.175351] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:58.357 [2024-07-12 11:00:33.183273] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:33:58.357 [2024-07-12 11:00:33.183294] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:58.357 [2024-07-12 11:00:33.183306] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:58.357 [2024-07-12 11:00:33.191297] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:33:58.357 [2024-07-12 11:00:33.191316] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:58.357 [2024-07-12 11:00:33.191334] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:58.357 [2024-07-12 11:00:33.199320] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:33:58.357 [2024-07-12 11:00:33.199339] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:58.357 [2024-07-12 11:00:33.199351] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:58.357 Running I/O for 5 seconds...
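The bdev_verify step above drives bdevperf against crypto vbdevs described by the --json config; that config is generated by the test scripts and is not echoed into this log. Below is a minimal, hypothetical reproduction sketch, assuming current SPDK JSON-config conventions (bdev_malloc_create plus bdev_crypto_create referencing a named accel key). The Malloc0 size, the crypto_ram-to-key pairing, and the /tmp path are placeholders, and the accel key (test_dek_qat_cbc in the log) is assumed to be defined elsewhere, e.g. in an accel subsystem section of the same file.

# Hypothetical reproduction sketch -- not the exact file used by this CI job.
cat > /tmp/bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram", "key_name": "test_dek_qat_cbc" } }
      ]
    }
  ]
}
EOF
# Same flags as the command in the log: queue depth 128, 4096-byte I/O,
# verify workload, 5 second run, core mask 0x3 (cores 0 and 1).
./build/examples/bdevperf --json /tmp/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3

The Malloc1-Malloc3 / crypto_ram1-crypto_ram3 pairs seen in the log follow the same pattern with the other three keys.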
00:34:03.673
00:34:03.673 Latency(us)
00:34:03.673 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:03.673 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x0 length 0x1000
00:34:03.673 crypto_ram : 5.07 487.94 1.91 0.00 0.00 260985.87 3077.34 179625.63
00:34:03.673 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x1000 length 0x1000
00:34:03.673 crypto_ram : 5.07 495.44 1.94 0.00 0.00 257205.76 3362.28 178713.82
00:34:03.673 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x0 length 0x1000
00:34:03.673 crypto_ram1 : 5.07 492.45 1.92 0.00 0.00 258340.00 2721.17 165948.55
00:34:03.673 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x1000 length 0x1000
00:34:03.673 crypto_ram1 : 5.07 498.42 1.95 0.00 0.00 255310.08 3476.26 165036.74
00:34:03.673 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x0 length 0x1000
00:34:03.673 crypto_ram2 : 5.05 3826.83 14.95 0.00 0.00 33172.34 5527.82 29405.72
00:34:03.673 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x1000 length 0x1000
00:34:03.673 crypto_ram2 : 5.05 3852.43 15.05 0.00 0.00 32951.33 5584.81 29405.72
00:34:03.673 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x0 length 0x1000
00:34:03.673 crypto_ram3 : 5.05 3824.39 14.94 0.00 0.00 33106.52 6382.64 29633.67
00:34:03.673 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:03.673 Verification LBA range: start 0x1000 length 0x1000
00:34:03.673 crypto_ram3 : 5.05 3849.98 15.04 0.00 0.00 32885.27 6382.64 29405.72
00:34:03.673 ===================================================================================================================
00:34:03.673 Total : 17327.88 67.69 0.00 0.00 58731.31 2721.17 179625.63
00:34:03.673
00:34:03.673 real 0m8.261s
00:34:03.673 user 0m15.658s
00:34:03.673 sys 0m0.379s
00:34:03.673 11:00:38 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:03.673 11:00:38 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:34:03.673 ************************************
00:34:03.673 END TEST bdev_verify
00:34:03.673 ************************************
00:34:03.673 11:00:38 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:34:03.673 11:00:38 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:34:03.673 11:00:38 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:34:03.673 11:00:38 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:03.673 11:00:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:03.944 ************************************
00:34:03.944 START TEST bdev_verify_big_io
00:34:03.944 ************************************
00:34:03.944 11:00:38 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:34:03.944 [2024-07-12 11:00:38.936318] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization...
00:34:03.944 [2024-07-12 11:00:38.936373] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2219919 ]
00:34:03.944 [2024-07-12 11:00:39.052210] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:34:04.204 [2024-07-12 11:00:39.155211] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:34:04.204 [2024-07-12 11:00:39.155216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:04.204 [2024-07-12 11:00:39.176601] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:34:04.204 [2024-07-12 11:00:39.184632] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:34:04.204 [2024-07-12 11:00:39.192658] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:34:04.204 [2024-07-12 11:00:39.298341] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:34:06.737 [2024-07-12 11:00:41.506655] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:34:06.737 [2024-07-12 11:00:41.506751] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:06.737 [2024-07-12 11:00:41.506768] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:06.737 [2024-07-12 11:00:41.514673] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:34:06.737 [2024-07-12 11:00:41.514695] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:06.737 [2024-07-12 11:00:41.514707] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:06.737 [2024-07-12 11:00:41.522696] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:34:06.737 [2024-07-12 11:00:41.522716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:06.737 [2024-07-12 11:00:41.522728] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:06.737 [2024-07-12 11:00:41.530718] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:34:06.737 [2024-07-12 11:00:41.530737] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:06.737 [2024-07-12 11:00:41.530748] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:06.737 Running I/O for 5 seconds...
00:34:07.304 [2024-07-12 11:00:42.473750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:07.304 [2024-07-12 11:00:42.474232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:07.304 [2024-07-12 11:00:42.474310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.474359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.474401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.474444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.474875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.474894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.478383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.478436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.478488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.478530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.479014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.479061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.479103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.479145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.479573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.479593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.483746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.304 [2024-07-12 11:00:42.484122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.484140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.487643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.487707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.487748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.487789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.488258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.488304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.488363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.488404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.488865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.488884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.492876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.493306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.493325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.496662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.496723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.304 [2024-07-12 11:00:42.496763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.496805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.497315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.497362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.304 [2024-07-12 11:00:42.497404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.497447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.497880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.497899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.500929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.500975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.501018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.501059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.501530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.501575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.501617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.501661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.502060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.502078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.505360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.505407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.505448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.505500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.505971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.506017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.565 [2024-07-12 11:00:42.506071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.506112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.506559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.506577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.509894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.509941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.509982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.510027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.510416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.510461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.510509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.510551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.510904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.510921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.514530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.514585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.514641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.514695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.515187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.515245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.515287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.515330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.515740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.515759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.565 [2024-07-12 11:00:42.519149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.519197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.519238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.519293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.519703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.519754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.519796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.519837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.520278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.520297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.523413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.523461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.523507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.523549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.524004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.565 [2024-07-12 11:00:42.524049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.524092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.524135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.524562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.524581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.527648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.527695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.527736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.527780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.566 [2024-07-12 11:00:42.528252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.528298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.528340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.528381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.528816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.528836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.531985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.532032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.532074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.532116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.532595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.532642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.532684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.532726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.533166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.533184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.536886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.566 [2024-07-12 11:00:42.537313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.537332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.540304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.540351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.540392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.540455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.540968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.541016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.541058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.541100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.541523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.541542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.544505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.544551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.544592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.544635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.545100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.545151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.545194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.545236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.545647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.545666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.548766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.548813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.566 [2024-07-12 11:00:42.548854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.548896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.549377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.549429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.549487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.549534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.549944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.549963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.553752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.554143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.554161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.557388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.557437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.557503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.557559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.558024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.558087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.566 [2024-07-12 11:00:42.558130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.558172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.558626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.558645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.561690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.561735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.561785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.561827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.562304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.562361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.562404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.562445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.562884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.562906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.565772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.565819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.565859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.565901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.566385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.566432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.566474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.566521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.566960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.566979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.566 [2024-07-12 11:00:42.569768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.569814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.569855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.569896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.570351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.570398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.570449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.570496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.570875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.570894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.573941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.573988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.574031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.574072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.574542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.574588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.574630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.574672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.575159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.575178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.577965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.578011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.578055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.578097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.566 [2024-07-12 11:00:42.578502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.578547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.578589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.578630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.579061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.579083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.582765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.583157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.583176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.586758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.566 [2024-07-12 11:00:42.587147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.587165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.590160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.590217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.566 [2024-07-12 11:00:42.590259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.590302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.590692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.590738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.590781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.590822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.591199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.591217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.594939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.595356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.595375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.598018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.567 [2024-07-12 11:00:42.598071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.567 [2024-07-12 11:00:42.598115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:07.833 [2024-07-12 11:00:42.998967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:07.833 (the same src_mbufs allocation error repeats continuously between 11:00:42.598115 and 11:00:42.998967; duplicate entries collapsed)
00:34:07.833 [2024-07-12 11:00:43.001254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.001303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.001346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.001397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.001796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.001861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.001917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.001960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.002015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.002386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.002405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.004905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.004976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.005034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.005077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.005486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.005553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.005598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.005640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.005682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.006103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.006122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.008456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.008509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.833 [2024-07-12 11:00:43.008553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.008605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.008949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.009017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.009062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.009104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.009146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.009579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.009599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.011954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.012685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.013123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.013142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.015281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.015328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.015371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.015415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:07.833 [2024-07-12 11:00:43.015869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.015928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.015970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.016011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.016053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.833 [2024-07-12 11:00:43.016446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.016463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.018771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.018817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.018863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.018905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.019333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.019387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.019430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.019471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.019520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.019949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:07.834 [2024-07-12 11:00:43.019966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.094 [2024-07-12 11:00:43.022893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.022976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.023424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.023441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.025701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.025747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.025790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.025834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.026236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.026287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.026328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.026369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.026413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.026851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.026870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.029872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.094 [2024-07-12 11:00:43.029918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.030313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.030329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.032962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.033408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.033425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.035804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.036126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.094 [2024-07-12 11:00:43.036141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.037718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.037763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.037806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.037848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.038302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.094 [2024-07-12 11:00:43.038354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.038399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.038441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.038491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.038919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.038934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.040959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.041899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.043436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.095 [2024-07-12 11:00:43.043487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.043533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.043574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.043962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.044020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.044062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.044103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.044144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.044578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.044596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.046725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.046769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.046810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.046851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.047121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.047184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.047226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.047267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.047307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.047582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.047602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.095 [2024-07-12 11:00:43.049397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.049854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.050280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.050296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.052578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.052623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.053945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.053993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.054265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.054324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.054365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.054406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.054460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.054737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.054753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.056449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.056500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.056542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.057588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.058010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.095 [2024-07-12 11:00:43.058063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.058107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.058154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.058198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.058641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.058659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.062294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.063838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.064552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.065825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.066096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.067734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.069429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.069828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.070217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.070614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.070630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.074044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.075421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.076737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.078023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.078294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.079845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.080882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.095 [2024-07-12 11:00:43.081274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.081666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.082130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.082147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.085370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.086109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.087518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.095 [2024-07-12 11:00:43.089115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.089387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.090921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.091317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.091710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.092096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.092527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.092544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.095733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.096833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.098117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.099632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.099905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.101101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.101502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.101890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.102279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.096 [2024-07-12 11:00:43.102717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.102734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.104965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.106367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.107946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.109474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.109775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.110181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.110574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.110960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.111346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.111661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.111677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.114566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.115848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.117386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.118929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.119320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.119738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.120123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.120517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.121187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.121460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.121477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.096 [2024-07-12 11:00:43.124380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.125910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.127419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.128799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.129188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.129591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.129980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.130365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.131716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.132066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.132082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.134942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.136473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.138005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.138624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.139093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.139498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.139888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.140478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.141781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.142057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.142073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.145276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.146811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.096 [2024-07-12 11:00:43.148007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.148397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.148841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.149239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.149630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.151137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.152462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.152746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.152764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.155916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.157442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.157911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.158301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.158708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.159106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.159869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.161170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.162706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.162980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.162996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.166175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.167338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.167736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.096 [2024-07-12 11:00:43.168123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.096 [2024-07-12 11:00:43.168567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... identical *ERROR* line from accel_dpdk_cryptodev.c:468 repeated several hundred times between 11:00:43.168567 and 11:00:43.475702; duplicate occurrences omitted ...] 
00:34:08.364 [2024-07-12 11:00:43.475702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.364 [2024-07-12 11:00:43.475767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.475811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.476078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.476140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.476185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.476225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.476266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.476547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.476564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.478939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.478984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.479831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.481556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.481600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.481641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.481682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.364 [2024-07-12 11:00:43.481947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.482006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.482047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.482088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.482128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.482395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.482410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.484595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.484640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.484686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.484728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.485154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.485206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.485251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.485292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.485333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.485618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.485634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.364 [2024-07-12 11:00:43.487800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.487881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.488155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.488170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.490437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.490491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.490534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.490575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.491043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.491097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.491139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.491182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.491224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.491576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.491592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.493124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.493170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.494581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.494626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.494959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.495019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.495060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.495100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.364 [2024-07-12 11:00:43.495141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.495412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.495429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.497655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.497713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.497757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.364 [2024-07-12 11:00:43.498146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.498538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.498595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.498642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.498683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.498725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.499080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.499097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.502011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.503556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.505088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.505712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.506197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.506603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.506991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.507571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.508866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.509140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.365 [2024-07-12 11:00:43.509156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.512421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.513941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.515393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.515790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.516228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.516667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.517062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.518306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.519607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.519879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.519894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.523097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.524630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.525071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.525460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.525863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.526262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.527123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.528409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.529934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.530209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.530225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.533394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.365 [2024-07-12 11:00:43.534599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.534991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.535380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.535792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.536194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.537537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.538841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.540367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.540647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.540664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.543862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.544303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.544697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.545084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.545542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.546275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.547559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.365 [2024-07-12 11:00:43.549070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.550574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.550847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.550863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.553774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.554174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.554568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.628 [2024-07-12 11:00:43.554960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.555391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.556606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.557900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.559434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.560959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.561376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.561393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.563537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.563933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.564320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.564712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.565079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.566371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.567907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.569436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.570725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.571052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.571068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.573002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.573401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.573797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.574187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.574458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.628 [2024-07-12 11:00:43.575763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.577295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.578829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.579550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.579824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.579844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.581906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.582301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.582692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.583861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.584175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.585715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.587237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.588418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.589888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.590203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.590219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.592296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.592696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.593082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.594747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.595054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.596606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.598141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.628 [2024-07-12 11:00:43.598843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.600276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.600552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.600569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.602906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.603300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.604386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.605676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.605948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.607509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.608736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.610178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.611466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.611746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.611762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.614172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.614574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.616250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.617810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.618084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.619633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.620347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.621695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.623237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.628 [2024-07-12 11:00:43.623512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.623528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.626110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.627297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.628598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.630127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.630402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.631527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.633099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.634502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.636021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.636295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.636311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.638934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.640651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.642195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.643808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.644081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.644796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.646159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.647698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.649230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.649509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.649526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.628 [2024-07-12 11:00:43.652893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.654192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.655719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.657242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.657558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.628 [2024-07-12 11:00:43.658964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.660255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.661791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.663326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.663727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.663743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.667711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.669053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.670591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.672128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.672541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.674138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.675809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.677518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.679131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.679510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.679526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.683030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.684559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.629 [2024-07-12 11:00:43.686102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.687215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.687495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.688789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.690305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.691828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.692598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.693074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.693091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.696504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.698046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.699584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.700289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.700565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.702009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.703548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.705128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.705526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.705970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.705987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.709617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.711157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.712462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.713799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.629 [2024-07-12 11:00:43.714139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.715688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.717226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.718227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.718625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.719064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.719081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.722591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.724133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.724882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.726307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.726585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.728127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.729662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.730059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.730448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.730834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.730851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.734302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.735593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.736947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.738226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.738504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.629 [2024-07-12 11:00:43.740059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.629 [2024-07-12 11:00:43.741045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:08.895 [2024-07-12 11:00:43.954643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:08.895 [2024-07-12 11:00:43.954914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.956504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.958010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.959147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.960447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.960722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.960739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.963128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.963528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.965132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.966557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.966828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.968380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.969109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.970587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.972267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.972543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.972559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.975025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.975771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.977059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.978592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.978863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.980499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.895 [2024-07-12 11:00:43.981555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.982837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.984370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.984646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.984662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.987279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.988679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.989978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.991504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.991777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.992737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.994431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.995951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.997546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.997818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:43.997834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.000634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.001999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.003530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.005061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.005331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.006131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.007418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.008938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.895 [2024-07-12 11:00:44.010470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.010747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.010763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.014243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.015556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.017085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.018617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.018995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.020563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.021958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.023489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.025028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.025432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.025448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.029505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.031191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.032870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.034475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.034837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.036136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.037668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.039203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.040555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.040938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.895 [2024-07-12 11:00:44.040954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.044505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.046024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.047567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.048598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.048870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.050178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.895 [2024-07-12 11:00:44.051709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.053241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.053950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.054430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.054447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.057946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.059462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.060988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.061708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.061985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.063656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.065346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.066946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.067336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.067774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.067791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.071407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:08.896 [2024-07-12 11:00:44.072939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.074248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.075608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.075952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.077479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.079016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.079989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.080399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.080834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:08.896 [2024-07-12 11:00:44.080851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.084368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.085895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.086610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.087958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.088229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.089774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.091350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.091746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.092137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.092524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.092541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.095992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.097428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.098640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.157 [2024-07-12 11:00:44.099936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.157 [2024-07-12 11:00:44.100211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.101768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.102862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.103250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.103643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.104052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.104069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.107404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.108136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.109551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.111137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.111405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.112939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.113335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.113729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.114119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.114577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.114594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.117807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.119005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.120300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.121833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.122103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.158 [2024-07-12 11:00:44.123266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.123662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.124053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.124460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.124892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.124909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.127179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.128596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.130151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.131678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.131950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.132349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.132742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.133130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.133522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.133865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.133881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.136691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.137982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.139518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.141085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.141478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.141905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.142294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.158 [2024-07-12 11:00:44.142691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.143248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.143523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.143539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.146485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.147996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.149509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.150709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.151078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.151488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.151875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.152266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.153643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.153964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.153985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.156941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.158488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.160016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.160412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.160884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.161288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.161685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.162359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.163650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.158 [2024-07-12 11:00:44.163924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.163939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.167118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.168651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.169707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.170095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.170530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.170934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.171324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.172431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.173962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.174234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.174250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.176629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.177025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.177417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.177813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.178240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.179808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.181501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.183144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.184707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.185109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.185126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.158 [2024-07-12 11:00:44.187071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.187467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.187865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.188254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.158 [2024-07-12 11:00:44.188653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.189058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.189450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.189847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.190238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.190673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.190692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.193235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.193641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.194034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.194426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.194828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.195231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.195627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.196014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.196405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.196787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.196804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.199598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.199997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.159 [2024-07-12 11:00:44.200388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.200782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.201237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.201655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.202046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.202444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.202849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.203280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.203296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.206066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.206459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.206859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.207257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.207682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.208087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.208491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.208881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.209270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.209705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.209722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.212445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.212850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.213249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.213651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.159 [2024-07-12 11:00:44.214131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.214543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.214935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.215325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.215738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.216214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.216231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.219065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.219471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.219864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.220256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.220632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.221035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.221426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.221825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.222237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.222680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.222699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.225443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.225851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.226243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.226643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.227017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.227422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.159 [2024-07-12 11:00:44.227824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.228214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.228610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.229027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.229044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.231585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.233023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.233414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.233810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.234222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.234668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.236104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.236498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.237165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.237439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.237455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.241216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.241620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.241668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.242601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.242880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.243285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.243687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.159 [2024-07-12 11:00:44.244081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.159 [2024-07-12 11:00:44.245000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.159 [the identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" message repeats continuously from 11:00:44.245 through 11:00:44.654] 
00:34:09.686 [2024-07-12 11:00:44.653985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.686 [2024-07-12 11:00:44.654377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.654768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.655152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.655596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.655613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.658206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.658605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.659007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.659064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.659911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.660301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.660691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.661082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.661453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.661470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.664250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.664309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.664709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.664757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.665553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.665597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.665985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.666027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.666470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.686 [2024-07-12 11:00:44.666492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.669024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.669082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.669470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.669529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.670280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.670332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.670728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.670771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.671207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.671222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.674592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.674645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.675036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.675083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.675858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.675908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.676295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.676337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.676789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.676805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.679378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.679429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.679821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.686 [2024-07-12 11:00:44.679865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.680706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.680757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.681152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.681199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.681692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.681712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.684316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.684367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.684758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.684801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.685559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.685604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.685996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.686043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.686436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.686452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.689174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.689227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.689621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.689670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.690511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.690557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.690942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.686 [2024-07-12 11:00:44.690985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.691428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.691444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.694184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.694235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.694630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.694679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.695338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.695390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.696864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.696912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.697315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.697332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.700088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.700145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.700188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.686 [2024-07-12 11:00:44.700225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.701049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.701095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.701137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.701531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.701986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.702002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.704343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.687 [2024-07-12 11:00:44.704395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.704436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.704477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.704953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.705000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.705043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.705087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.705433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.705449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.707895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.709897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.709944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.709986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.710028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.710468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.687 [2024-07-12 11:00:44.710516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.710558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.710600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.711026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.711045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.712607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.712651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.712692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.712736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.713164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.713214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.713255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.713295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.713613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.713630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.715378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.715424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.715466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.715515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.715983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.716027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.716076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.716118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.716585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.687 [2024-07-12 11:00:44.716602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.718850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.719121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.719137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.720803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.720862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.720904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.720945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.721434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.721479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.721530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.721572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.721948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.721964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.723871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.723915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.723956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.687 [2024-07-12 11:00:44.724000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.724309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.724351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.724393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.724433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.724929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.724946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.726485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.726533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.726582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.726624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.727039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.727082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.727123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.727165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.727606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.727623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.729620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.729665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.687 [2024-07-12 11:00:44.729705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.729745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.730058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.730101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.730150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.688 [2024-07-12 11:00:44.730200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.730468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.730489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.732801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.733271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.733288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.735399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.735448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.735495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.735856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.735899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.735940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.736209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.866004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.871715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.871783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.873283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.688 [2024-07-12 11:00:44.873339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.874598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.874652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.876068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.876371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.876387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.688 [2024-07-12 11:00:44.876406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.884167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.884865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.886259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.886535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.886551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.888866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.889254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.890207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.891495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.893319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.894792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.895974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.897244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.897521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.897537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.899838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.900230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.950 [2024-07-12 11:00:44.901853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.903295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.905112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.905885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.907393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.909095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.909372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.909388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.911813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.912650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.913946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.915473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.917344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.918457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.919751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.921277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.921554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.921570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.924171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.925817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.927275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.928807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.929856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.931359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.950 [2024-07-12 11:00:44.933045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.934668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.934942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.934958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.937948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.939254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.940768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.942285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.943525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.944826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.946350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.947866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.948196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.948213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.951946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.953247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.954772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.956303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.958293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.959763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.961305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.962918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.963339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.963355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.950 [2024-07-12 11:00:44.967402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.969092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.970757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.971418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.973327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.974780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.975173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.975563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.975964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.975980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.979308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.950 [2024-07-12 11:00:44.980449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.981961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.983294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.985103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.985987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.986386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.986778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.987244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.987260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.990449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.991094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.992762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:09.951 [2024-07-12 11:00:44.994237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:09.951 [2024-07-12 11:00:44.996059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:09.951 [... the same *ERROR* line from accel_dpdk_cryptodev.c:468 repeats several hundred times with successive timestamps between 11:00:44.996 and 11:00:45.384; duplicate entries collapsed ...]
00:34:10.214 [2024-07-12 11:00:45.384062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:10.214 [2024-07-12 11:00:45.384467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.384861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.384905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.385378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.385871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.385917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.387206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.388751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.389025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.389041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.394192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.395453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.395505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.395895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.396225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.396279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.397303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.397710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.397755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.398091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.398107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.400669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.400727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.401119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.214 [2024-07-12 11:00:45.401636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.401913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.402315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.402737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.404186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.404232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.404686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.214 [2024-07-12 11:00:45.404703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.474 [2024-07-12 11:00:45.407801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.474 [2024-07-12 11:00:45.408201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.474 [2024-07-12 11:00:45.408602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.474 [2024-07-12 11:00:45.409532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.474 [2024-07-12 11:00:45.409822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.474 [2024-07-12 11:00:45.410222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.474 [2024-07-12 11:00:45.411041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.412095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.412140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.412576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.412593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.414876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.415270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.415662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.416052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.416389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.475 [2024-07-12 11:00:45.417446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.418272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.418665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.418711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.418992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.419008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.422313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.422714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.423107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.423499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.423855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.425350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.425747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.426136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.426183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.426457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.426473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.428899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.429292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.429682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.430072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.430491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.430895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.431409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.475 [2024-07-12 11:00:45.432775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.432820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.433263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.433279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.436589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.436986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.437375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.437424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.437874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.438274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.438678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.439848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.439896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.440282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.440299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.442788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.443183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.443579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.443626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.444063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.444458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.444856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.445247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.445301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.475 [2024-07-12 11:00:45.445659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.445677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.451971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.452374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.452769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.452817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.453240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.453643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.454030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.454426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.454485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.454981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.454997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.457234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.457634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.458034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.458091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.458524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.458926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.459311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.459704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.459751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.460127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.460142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.475 [2024-07-12 11:00:45.463056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.463456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.463510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.463900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.464333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.464740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.465129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.465523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.465578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.465963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.465980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.469229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.469901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.469949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.470339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.470741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.471145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.471542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.471931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.471977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.472444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.472461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.476749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.478312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.475 [2024-07-12 11:00:45.478707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.478754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.479164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.479571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.479967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.480015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.480402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.480801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.480819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.484437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.484491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.484880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.484923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.485242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.486404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.486450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.486843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.486887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.487261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.487277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.490749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.490806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.492252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.492297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.475 [2024-07-12 11:00:45.492769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.493173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.493222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.494878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.494932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.495384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.495401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.475 [2024-07-12 11:00:45.497951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.498004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.498388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.498431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.498837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.499239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.499304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.499948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.499997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.500270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.500287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.503208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.503265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.504501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.504552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.504983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.505380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.476 [2024-07-12 11:00:45.505425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.505814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.505858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.506293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.506311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.508599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.508650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.510116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.510161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.510630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.511034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.511078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.511472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.511534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.511996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.512011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.518342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.518404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.518904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.518952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.519227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.519640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.519686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.520441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.476 [2024-07-12 11:00:45.520491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.520773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.520789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.524175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.524226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.525479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.525528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.525797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.527001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.527048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.528583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.528629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.528898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.528914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.532285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.532349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.533749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.533796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.534067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.535725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.535785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.537290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.537333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.537654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.476 [2024-07-12 11:00:45.537670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.540235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.540287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.540916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.541307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.541585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.542325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.542371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.542763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.542809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.543077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.543093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.546834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.546888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.548423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.548470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.548746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.549723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.549770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.549811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.550372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.550811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.550833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.552744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.476 [2024-07-12 11:00:45.552789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.552831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.552879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.553148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.553202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.553243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.553287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.553339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.553609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.553625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.558502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.558554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.558596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.558637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.558910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.558965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.559006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.559047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.559096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.559562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.559579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.561803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.561852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.561893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.476 [2024-07-12 11:00:45.561934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.562226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.562291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.562335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.562376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.562421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.562696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.562713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.567927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.568199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.568215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.570424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.570470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.570519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.476 [2024-07-12 11:00:45.570561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.477 [2024-07-12 11:00:45.570895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.477 [2024-07-12 11:00:45.570948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:10.998 [... previous message repeated several hundred times between 11:00:45.570948 and 11:00:45.979570; intermediate occurrences elided ...]
00:34:10.998 [2024-07-12 11:00:45.979570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:10.998 [2024-07-12 11:00:45.979616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.980019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.980425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.980917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.982302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.982350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.982801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.982819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.987467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.987870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.987919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.988728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.989009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.989415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.989813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.990209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.990258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.990604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.990621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.995494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.996204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.996252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.997245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.997668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.998 [2024-07-12 11:00:45.998359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.999555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.999940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:45.999992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:46.000357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:46.000373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:46.002939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:46.003338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:46.003733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.998 [2024-07-12 11:00:46.003783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.004117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.005157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.005998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.006048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.006436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.006722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.006738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.010189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.010245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.011650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.011699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.012166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.012611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.012670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.999 [2024-07-12 11:00:46.013060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.013122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.013501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.013518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.019505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.019564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.020085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.020132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.020407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.020824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.020878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.021629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.021677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.021972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.021989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.025761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.025815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.026204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.026247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.026540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.027513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.027561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.027949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.027990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.999 [2024-07-12 11:00:46.028358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.028374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.031320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.031375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.031784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.031838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.032243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.033773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.033825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.034221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.034268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.034683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.034700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.039161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.039216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.039915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.039965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.040245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.040654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.040699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.041089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.041137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.041478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.041500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.999 [2024-07-12 11:00:46.047610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.047667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.049006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.049051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.049325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.050869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.050916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.051634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.051679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.052024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.052040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.057317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.057374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.058085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.058129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.058579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.059919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.059966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.061282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.061326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.061603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.061619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.067100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:10.999 [2024-07-12 11:00:46.067154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:10.999 [2024-07-12 11:00:46.068142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.068186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.068623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.069558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.069604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.070384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.070429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.070870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.070890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.076882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.076938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.078638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.080274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.080656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.082019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.082064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.082451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.082498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.082820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.082836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.086892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.086941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.087878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.087926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.000 [2024-07-12 11:00:46.088200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.089530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.089577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.089619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.091298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.091582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.091599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.095837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.096111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.096126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.100649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.100702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.100742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.100783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.101071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.101131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.000 [2024-07-12 11:00:46.101174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.101216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.101273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.101552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.101568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.104908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.105180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.105196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.109592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.109650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.109692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.109733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.110005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.110064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.110106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.110147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.000 [2024-07-12 11:00:46.110188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.110612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.110629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.114515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.114572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.114613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.114654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.114926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.114985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.115026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.115067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.115108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.000 [2024-07-12 11:00:46.115623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.115640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.120825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.121102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.001 [2024-07-12 11:00:46.121118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.124891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.124955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.125832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.130806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.132114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.132158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.132200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.132621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.132680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.132723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.133971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.134016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.134325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.134341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.140550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.001 [2024-07-12 11:00:46.140605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.140660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.141631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.141938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.141997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.142387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.142430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.142474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.142770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.142785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.146614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.146664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.147383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.147427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.147746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.149397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.149444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.149489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.151026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.151302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.151318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.156297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.157926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.157980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.001 [2024-07-12 11:00:46.158022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.158299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.158353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.158402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.158449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.159973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.160244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.160264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.165999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.167281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.167723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.167740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.173053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.173108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.173150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.173191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.173463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.001 [2024-07-12 11:00:46.173527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.001 [2024-07-12 11:00:46.173570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.173611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.175167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.175554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.175571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.181807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.181863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.181905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.181946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.182221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.182281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.182323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.182363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.183904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.184281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.002 [2024-07-12 11:00:46.184297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.263 [2024-07-12 11:00:46.189534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.263 [2024-07-12 11:00:46.189589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.263 [2024-07-12 11:00:46.189631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.263 [2024-07-12 11:00:46.189673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.263 [2024-07-12 11:00:46.189990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.263 [2024-07-12 11:00:46.190044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.263 [2024-07-12 11:00:46.190085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.263 [2024-07-12 11:00:46.190126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev_task_alloc_resources (accel_dpdk_cryptodev.c:468) repeat several hundred times, wall-clock 11:00:46.190 through 11:00:46.547, log offsets 00:34:11.263 through 00:34:11.527 ...]
00:34:11.527 [2024-07-12 11:00:46.547178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:11.527 [2024-07-12 11:00:46.548464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.548517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.550052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.550099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.550372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.550389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.553053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.553105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.554492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.554539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.554811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.556464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.556536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.558040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.558086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.558431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.558448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.560391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.560441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.560835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.561240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.561687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.563273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.563326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.527 [2024-07-12 11:00:46.564909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.564962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.565234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.565250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.566916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.566961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.568499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.568547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.568888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.569291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.569337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.569380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.569780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.570259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.570276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.527 [2024-07-12 11:00:46.572889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.572905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.574568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.574618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.574662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.574703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.575178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.575231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.575275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.575318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.575360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.575759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.575776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.577673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.577718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.577759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.577800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.578071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.578130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.578177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.578218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.578259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.527 [2024-07-12 11:00:46.578753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.578770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.528 [2024-07-12 11:00:46.580310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.580357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.580401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.580443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.580851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.580902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.580946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.580989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.581031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.581467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.581488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.583689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.583737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.583788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.583830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.584100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.584165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.584207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.584249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.584290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.584627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.584645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.528 [2024-07-12 11:00:46.586288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.586916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.587341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.587358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.589958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.590226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.590243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.591923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.593533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.593578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.593620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.528 [2024-07-12 11:00:46.594049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.594103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.594147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.594540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.594584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.595021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.595038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.598322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.598377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.598419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.599192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.599500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.599566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.601078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.601124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.601165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.601439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.601455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.603705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.603751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.604140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.604188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.604459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.605732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.528 [2024-07-12 11:00:46.605779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.605819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.607442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.607725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.607741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.609509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.611043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.611090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.611136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.611550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.611602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.611646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.611688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.612076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.612584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.612601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.615817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.615868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.615918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.615963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.616337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.616393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.616435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.528 [2024-07-12 11:00:46.616477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.528 [2024-07-12 11:00:46.617755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.618032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.618048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.620429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.620478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.620527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.620571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.621009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.621064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.621106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.621147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.622428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.622709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.622726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.625898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.625948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.625989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.626029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.626299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.626357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.626400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.626444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.627049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.627531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.529 [2024-07-12 11:00:46.627550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.631812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.632900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.633174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.633191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.635164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.635213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.635262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.635653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.636103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.636156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.636200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.636246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.637224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.637561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.637578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.639195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.529 [2024-07-12 11:00:46.639239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.639280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.640676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.640953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.641012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.641054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.641094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.642721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.643259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.643275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.645776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.645830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.645870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.647164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.647439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.647502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.647544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.647585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.649127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.649526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.649543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.651089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.651134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.651178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.529 [2024-07-12 11:00:46.651580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.652020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.652074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.652117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.652159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.652552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.652997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.653014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.654591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.654635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.655409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.655454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.655730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.655789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.655834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.655879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.657408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.657683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.657699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.659883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.659928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.660318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.660361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.660641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.529 [2024-07-12 11:00:46.660696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.660738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.660787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.529 [2024-07-12 11:00:46.662348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.662626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.662642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.665832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.665889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.665935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.667332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.667740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.667792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.667834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.668222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.668265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.668702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.668719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.670567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.672104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.672149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.672886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.673164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.673224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.674829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.530 [2024-07-12 11:00:46.674874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.676401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.676678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.676695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.678961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.680286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.680331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.681643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.681916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.681974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.683510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.683554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.684272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.684547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.684564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.686227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.686625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.686668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.687055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.687506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.687560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.688786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.688831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.690114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.530 [2024-07-12 11:00:46.690384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.690400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.692050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.693591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.693635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.695318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.695749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.695809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.696200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.696243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.696635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.697064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.697080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.698735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.699719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.699766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.701294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.701575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.701636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.702427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.702471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.702862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.703276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.703292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.530 [2024-07-12 11:00:46.705305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.706961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.707006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.708396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.708713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.708769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.710060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.710106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.711628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.711899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.711915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.714171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.714571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.714617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.530 [2024-07-12 11:00:46.715924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.531 [2024-07-12 11:00:46.716195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.531 [2024-07-12 11:00:46.716246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.717955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.718007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.719464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.719809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.719826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.721388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.721788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.791 [2024-07-12 11:00:46.721832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.722218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.722673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.722727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.723114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.723160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.724644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.724919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.724935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.726658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.728185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.728231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.728276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.728550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.728609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.729863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.729910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.730296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.730742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.730760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.734321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.735851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.735898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.736293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.791 [2024-07-12 11:00:46.736569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.736629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.738146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.739763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.739818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.791 [2024-07-12 11:00:46.740095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.740111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.742843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.743242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.743640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.744028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.744453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.744864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.745252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.745652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.746044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.746467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.746487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.749259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.749670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.750058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.750446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.750834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.751239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.792 [2024-07-12 11:00:46.751633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.752021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.752406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.752856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.752873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.755515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.755909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.756303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.756697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.757140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.757544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.757928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.758316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.758714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.759132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.759149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.761827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.762217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.762610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.763009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.763438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.763838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.764229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.764639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.792 [2024-07-12 11:00:46.765027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.765498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.765516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.768118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.768513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.768904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.769304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.769688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.770087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.770475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.770867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.771252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.771633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.771650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.774345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.774748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.775134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.775521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.775914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.776311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.776707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.777098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.777491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.777920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.792 [2024-07-12 11:00:46.777937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.780666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.781063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.781451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.781846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.782286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.782693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.783083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.783475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.783872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.784287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.784303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.786888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.786937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.787328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.787727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.788163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.788565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.788953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.789000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.789382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.789752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.789769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.792035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.792 [2024-07-12 11:00:46.792426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.792826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.792874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.793297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.793712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.793760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.794146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.794537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.794944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.794960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.797635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.798026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.798075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.798466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.798878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.798939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.799326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.799719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.799763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.800207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.800224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.802862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.802913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.803299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.792 [2024-07-12 11:00:46.803700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.804194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.804616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.805003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.805389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.792 [2024-07-12 11:00:46.805434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.805865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.805882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.808210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.808604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.808996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.809383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.809811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.810213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.810605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.810992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.811039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.811345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.811361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.813331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.813725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.814117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.814507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.814987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.793 [2024-07-12 11:00:46.815388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.815784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.816167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.816212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.816677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.816695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.819050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.819443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.819837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.820226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.820598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.820998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.821387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.821774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.821821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.822252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.822269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.824460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.824856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.825247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.825647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.826123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.826544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.826932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.793 [2024-07-12 11:00:46.827328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.827375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.827799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.827815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.829353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.830224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.831505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.831550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.831821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.833367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.834653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.835039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.835084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.835532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.835549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.839046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.840585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.841389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.841436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.841715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.843163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.844706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.846341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.846391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.793 [2024-07-12 11:00:46.846752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.846769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.850400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.851937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.853460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.853514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.853786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.854840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.856121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.857649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.857701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.857975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.857991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.860604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.862178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.863561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.863608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.863881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.865432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.866156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.867503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.867550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.867823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.867839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.793 [2024-07-12 11:00:46.870121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.870522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.870569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.871684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.872029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.873580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.875121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.876317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.876364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.876646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.876663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.878680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.879072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.879118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.879511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.879935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.881506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.883217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.884892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.884939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.885212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.885228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.886893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.888232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.793 [2024-07-12 11:00:46.888630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.888679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.889112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.889519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.889912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.889958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.891527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.891878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.891894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.894780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.894829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.896354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.896401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.896678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.897226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.897269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.897659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.897703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.898094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.898110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.901407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.901457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.902687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.902737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.793 [2024-07-12 11:00:46.903009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.904299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.904345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.905857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.905902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.906174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.906190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.908756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.908806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.910424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.910470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.910749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.912293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.912339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.914047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.914100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.914480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.914500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.916370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.916416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.916811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.916854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.917299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.793 [2024-07-12 11:00:46.917702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.794 [2024-07-12 11:00:46.917748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.919396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.919449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.919724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.919740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.922861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.922921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.924503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.924548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.924820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.925222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.925270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.925662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.925705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.926077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.926093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.929478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.929533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.930625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.930668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.930941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.932237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.932284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.933815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.794 [2024-07-12 11:00:46.933860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.934130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.934146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.936856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.936906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.938199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.938245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.938522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.940196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.940243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.941649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.941693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.942005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.942026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.943956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.944009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.944396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.944439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.944917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.945315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.945362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.947046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.947099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:11.794 [2024-07-12 11:00:46.947369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:11.794 [2024-07-12 11:00:46.947385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:11.794 [the same accel_dpdk_cryptodev.c:468 "Failed to get src_mbufs!" error repeats on the order of a few hundred more times, differing only in timestamp, from 11:00:46.950585 through 11:00:47.082467]
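The burst of "Failed to get src_mbufs!" messages above comes from the dpdk_cryptodev accel module hitting transient mbuf allocation failures while the verify job keeps a queue depth of 128 with 64 KiB I/Os in flight; the Fail/s column in the results that follow stays at 0.00, which suggests the affected tasks were retried rather than surfaced as I/O errors. A quick way to gauge how often that retry path was hit is to count the occurrences in the archived console output (the file name below is only an assumption about where this output gets saved):

  grep -c 'Failed to get src_mbufs!' crypto-phy-autotest-console.log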
00:34:12.622
00:34:12.622 Latency(us)
00:34:12.622 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:12.622 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x0 length 0x100
00:34:12.622 crypto_ram : 6.10 41.99 2.62 0.00 0.00 2958569.29 313660.99 2582232.38
00:34:12.622 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x100 length 0x100
00:34:12.622 crypto_ram : 6.14 41.71 2.61 0.00 0.00 2983034.88 289954.06 2713532.33
00:34:12.622 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x0 length 0x100
00:34:12.622 crypto_ram1 : 6.10 41.98 2.62 0.00 0.00 2851777.67 313660.99 2363399.12
00:34:12.622 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x100 length 0x100
00:34:12.622 crypto_ram1 : 6.14 41.71 2.61 0.00 0.00 2876211.20 288130.45 2494699.07
00:34:12.622 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x0 length 0x100
00:34:12.622 crypto_ram2 : 5.65 269.53 16.85 0.00 0.00 424358.07 82974.27 678383.08
00:34:12.622 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x100 length 0x100
00:34:12.622 crypto_ram2 : 5.61 254.93 15.93 0.00 0.00 447070.65 38751.72 696619.19
00:34:12.622 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x0 length 0x100
00:34:12.622 crypto_ram3 : 5.75 278.33 17.40 0.00 0.00 395965.45 37156.06 492374.82
00:34:12.622 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:12.622 Verification LBA range: start 0x100 length 0x100
00:34:12.622 crypto_ram3 : 5.74 267.48 16.72 0.00 0.00 413083.97 37611.97 344662.37
00:34:12.622 ===================================================================================================================
00:34:12.622 Total : 1237.66 77.35 0.00 0.00 779052.14 37156.06 2713532.33
00:34:13.190
00:34:13.190 real 0m9.330s
00:34:13.190 user 0m17.712s
00:34:13.190 sys 0m0.469s
00:34:13.190 11:00:48 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:13.190 11:00:48 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10
-- # set +x 00:34:13.190 ************************************ 00:34:13.190 END TEST bdev_verify_big_io 00:34:13.190 ************************************ 00:34:13.190 11:00:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:13.190 11:00:48 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:13.190 11:00:48 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:13.190 11:00:48 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:13.190 11:00:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:13.190 ************************************ 00:34:13.190 START TEST bdev_write_zeroes 00:34:13.190 ************************************ 00:34:13.190 11:00:48 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:13.190 [2024-07-12 11:00:48.362722] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:13.190 [2024-07-12 11:00:48.362788] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2221054 ] 00:34:13.449 [2024-07-12 11:00:48.494587] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:13.449 [2024-07-12 11:00:48.599794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:13.449 [2024-07-12 11:00:48.621135] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:13.449 [2024-07-12 11:00:48.629149] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:13.449 [2024-07-12 11:00:48.637167] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:13.725 [2024-07-12 11:00:48.757995] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:16.258 [2024-07-12 11:00:50.966478] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:16.258 [2024-07-12 11:00:50.966554] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:16.258 [2024-07-12 11:00:50.966570] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:16.258 [2024-07-12 11:00:50.974501] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:16.258 [2024-07-12 11:00:50.974521] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:16.258 [2024-07-12 11:00:50.974533] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:16.258 [2024-07-12 11:00:50.982521] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:16.258 [2024-07-12 11:00:50.982539] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:16.258 [2024-07-12 11:00:50.982550] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:16.258 [2024-07-12 11:00:50.990537] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:34:16.258 [2024-07-12 11:00:50.990554] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:16.258 [2024-07-12 11:00:50.990566] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:16.258 Running I/O for 1 seconds...
00:34:17.195
00:34:17.195 Latency(us)
00:34:17.195 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:17.195 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:17.195 crypto_ram : 1.03 2005.75 7.83 0.00 0.00 63304.59 5584.81 76591.64
00:34:17.195 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:17.195 crypto_ram1 : 1.03 2011.29 7.86 0.00 0.00 62770.49 5584.81 71120.81
00:34:17.195 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:17.195 crypto_ram2 : 1.02 15440.71 60.32 0.00 0.00 8162.92 2450.48 10770.70
00:34:17.195 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:17.195 crypto_ram3 : 1.02 15472.96 60.44 0.00 0.00 8119.70 2464.72 8548.17
00:34:17.195 ===================================================================================================================
00:34:17.195 Total : 34930.71 136.45 0.00 0.00 14481.55 2450.48 76591.64
00:34:17.456
00:34:17.456 real 0m4.190s
00:34:17.456 user 0m3.753s
00:34:17.456 sys 0m0.387s
00:34:17.456 11:00:52 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:17.456 11:00:52 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:34:17.456 ************************************ 00:34:17.456 END TEST bdev_write_zeroes 00:34:17.456 ************************************ 00:34:17.456 11:00:52 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:17.456 11:00:52 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:17.456 11:00:52 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:17.456 11:00:52 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:17.456 11:00:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:17.456 ************************************ 00:34:17.456 START TEST bdev_json_nonenclosed 00:34:17.456 ************************************ 00:34:17.456 11:00:52 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:17.456 [2024-07-12 11:00:52.637604] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization...
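As a rough cross-check on the write_zeroes results above (this arithmetic is not part of the harness output): with a queue depth of 128, Little's law predicts an average latency of about 128 / IOPS, i.e. 128 / 15440.71 ≈ 8.3 ms for crypto_ram2 and 128 / 2005.75 ≈ 63.8 ms for crypto_ram, in line with the reported averages of 8162.92 us and 63304.59 us.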
00:34:17.456 [2024-07-12 11:00:52.637671] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2221623 ] 00:34:17.713 [2024-07-12 11:00:52.766318] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:17.713 [2024-07-12 11:00:52.865820] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:17.713 [2024-07-12 11:00:52.865895] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:34:17.713 [2024-07-12 11:00:52.865916] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:17.713 [2024-07-12 11:00:52.865929] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:17.971 00:34:17.971 real 0m0.399s 00:34:17.971 user 0m0.242s 00:34:17.971 sys 0m0.153s 00:34:17.971 11:00:52 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:34:17.971 11:00:52 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:17.971 11:00:52 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:34:17.971 ************************************ 00:34:17.971 END TEST bdev_json_nonenclosed 00:34:17.971 ************************************ 00:34:17.971 11:00:53 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:34:17.971 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:34:17.971 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:17.971 11:00:53 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:17.971 11:00:53 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:17.971 11:00:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:17.971 ************************************ 00:34:17.971 START TEST bdev_json_nonarray 00:34:17.971 ************************************ 00:34:17.971 11:00:53 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:17.971 [2024-07-12 11:00:53.114408] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:17.971 [2024-07-12 11:00:53.114470] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2221783 ] 00:34:18.230 [2024-07-12 11:00:53.245044] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:18.230 [2024-07-12 11:00:53.347144] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:18.230 [2024-07-12 11:00:53.347216] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
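Both JSON negative tests in this stretch feed bdevperf a deliberately malformed --json config and expect the run to fail (the harness checks for exit status 234): nonenclosed.json trips the "not enclosed in {}" check reported above, and nonarray.json trips the "'subsystems' should be an array" check reported here. The actual fixture files live under spdk/test/bdev and are not reproduced in this log; the snippet below is only an illustrative sketch of the kind of content that triggers each error, written to hypothetical /tmp paths:

  # Top-level value is an array, not an object -> "Invalid JSON configuration: not enclosed in {}."
  cat > /tmp/nonenclosed.json <<'EOF'
  [ { "subsystem": "bdev", "config": [] } ]
  EOF

  # "subsystems" given as an object instead of an array -> "'subsystems' should be an array."
  cat > /tmp/nonarray.json <<'EOF'
  { "subsystems": { "subsystem": "bdev", "config": [] } }
  EOF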
00:34:18.230 [2024-07-12 11:00:53.347292] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:18.230 [2024-07-12 11:00:53.347305] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:18.488 00:34:18.488 real 0m0.400s 00:34:18.488 user 0m0.237s 00:34:18.488 sys 0m0.160s 00:34:18.488 11:00:53 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:34:18.488 11:00:53 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:18.488 11:00:53 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:34:18.488 ************************************ 00:34:18.488 END TEST bdev_json_nonarray 00:34:18.488 ************************************ 00:34:18.488 11:00:53 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:34:18.488 11:00:53 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:34:18.488 00:34:18.488 real 1m11.974s 00:34:18.488 user 2m39.828s 00:34:18.488 sys 0m8.976s 00:34:18.488 11:00:53 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:18.488 11:00:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:18.488 ************************************ 00:34:18.488 END TEST blockdev_crypto_qat 00:34:18.488 ************************************ 00:34:18.488 11:00:53 -- common/autotest_common.sh@1142 -- # return 0 00:34:18.488 11:00:53 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:34:18.488 11:00:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:34:18.488 11:00:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:18.488 11:00:53 -- common/autotest_common.sh@10 -- # set +x 00:34:18.488 ************************************ 00:34:18.488 START TEST chaining 00:34:18.488 ************************************ 00:34:18.488 11:00:53 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:34:18.746 * Looking for test storage... 
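The chaining test that starts here runs against an NVMe-oF target over TCP; since no supported NIC is found on this runner (see the "fallback requested for tcp test" warning further below), nvmf/common.sh builds a veth-plus-namespace loopback topology instead. Stripped of the harness wrappers, the core of that fallback setup amounts to roughly the commands below, with interface and namespace names taken from the trace that follows and the nvmf_br bridge plumbing left out:

  ip netns add nvmf_tgt_ns_spdk
  ip link add nvmf_init_if type veth peer name nvmf_init_br
  ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br
  ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk
  ip addr add 10.0.0.1/24 dev nvmf_init_if
  ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if
  ip link set nvmf_init_if up
  ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
  ip netns exec nvmf_tgt_ns_spdk ip link set lo up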
00:34:18.746 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:18.746 11:00:53 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@7 -- # uname -s 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:34:18.746 11:00:53 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:34:18.746 11:00:53 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:34:18.746 11:00:53 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:34:18.746 11:00:53 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:18.746 11:00:53 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:18.746 11:00:53 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:18.746 11:00:53 chaining -- paths/export.sh@5 -- # 
export PATH 00:34:18.746 11:00:53 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@47 -- # : 0 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:34:18.746 11:00:53 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:34:18.746 11:00:53 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:34:18.746 11:00:53 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:34:18.746 11:00:53 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:34:18.746 11:00:53 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:34:18.746 11:00:53 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:34:18.746 11:00:53 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:18.747 11:00:53 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:18.747 11:00:53 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:18.747 11:00:53 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:34:18.747 11:00:53 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:34:18.747 11:00:53 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:34:18.747 11:00:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@296 -- # e810=() 00:34:26.859 
11:01:01 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@297 -- # x722=() 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@298 -- # mlx=() 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:34:26.859 11:01:01 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@336 -- # return 1 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:34:26.860 WARNING: No supported devices were found, fallback requested for tcp test 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:34:26.860 11:01:01 
chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:34:26.860 Cannot find device "nvmf_tgt_br" 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@155 -- # true 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:34:26.860 Cannot find device "nvmf_tgt_br2" 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@156 -- # true 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:34:26.860 Cannot find device "nvmf_tgt_br" 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@158 -- # true 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:34:26.860 Cannot find device "nvmf_tgt_br2" 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@159 -- # true 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:34:26.860 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@162 -- # true 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:34:26.860 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@163 -- # true 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:34:26.860 
11:01:01 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:34:26.860 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:26.860 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.108 ms 00:34:26.860 00:34:26.860 --- 10.0.0.2 ping statistics --- 00:34:26.860 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:26.860 rtt min/avg/max/mdev = 0.108/0.108/0.108/0.000 ms 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:34:26.860 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:34:26.860 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.084 ms 00:34:26.860 00:34:26.860 --- 10.0.0.3 ping statistics --- 00:34:26.860 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:26.860 rtt min/avg/max/mdev = 0.084/0.084/0.084/0.000 ms 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:34:26.860 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:34:26.860 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.025 ms 00:34:26.860 00:34:26.860 --- 10.0.0.1 ping statistics --- 00:34:26.860 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:26.860 rtt min/avg/max/mdev = 0.025/0.025/0.025/0.000 ms 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@433 -- # return 0 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:26.860 11:01:01 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:26.860 11:01:01 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:26.860 11:01:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@481 -- # nvmfpid=2225459 00:34:26.860 11:01:01 chaining -- nvmf/common.sh@482 -- # waitforlisten 2225459 00:34:26.860 11:01:01 chaining -- common/autotest_common.sh@829 -- # '[' -z 2225459 ']' 00:34:26.860 11:01:01 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:26.860 11:01:01 chaining -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:34:26.860 11:01:01 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:26.860 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:26.860 11:01:01 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:26.860 11:01:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:26.860 [2024-07-12 11:01:01.781680] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:26.860 [2024-07-12 11:01:01.781745] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:26.860 [2024-07-12 11:01:01.907601] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:26.860 [2024-07-12 11:01:02.003588] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:26.860 [2024-07-12 11:01:02.003635] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:34:26.860 [2024-07-12 11:01:02.003649] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:26.860 [2024-07-12 11:01:02.003662] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:34:26.860 [2024-07-12 11:01:02.003673] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:34:26.860 [2024-07-12 11:01:02.003701] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:27.795 11:01:02 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:27.795 11:01:02 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.uK2gW6wA1y 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.VJA1XoUMTU 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:27.795 malloc0 00:34:27.795 true 00:34:27.795 true 00:34:27.795 [2024-07-12 11:01:02.743371] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:27.795 crypto0 00:34:27.795 [2024-07-12 11:01:02.751398] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:27.795 crypto1 00:34:27.795 [2024-07-12 11:01:02.759530] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:27.795 [2024-07-12 11:01:02.775748] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:27.795 11:01:02 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@85 -- # update_stats 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:27.795 11:01:02 chaining -- 
bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:27.795 11:01:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:27.795 11:01:02 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.uK2gW6wA1y bs=1K count=64 00:34:28.053 64+0 records in 00:34:28.053 64+0 records out 00:34:28.053 65536 bytes (66 kB, 64 KiB) copied, 0.000886656 s, 73.9 MB/s 00:34:28.053 11:01:02 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.uK2gW6wA1y --ob Nvme0n1 --bs 65536 --count 1 00:34:28.053 11:01:02 chaining -- bdev/chaining.sh@25 -- # local config 00:34:28.053 11:01:02 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:28.053 11:01:02 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:28.053 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:28.053 11:01:03 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:28.053 "subsystems": [ 00:34:28.053 { 00:34:28.053 "subsystem": "bdev", 00:34:28.053 "config": [ 00:34:28.053 { 00:34:28.053 "method": "bdev_nvme_attach_controller", 00:34:28.053 "params": { 00:34:28.053 "trtype": "tcp", 00:34:28.053 "adrfam": "IPv4", 00:34:28.053 "name": "Nvme0", 00:34:28.053 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:28.053 "traddr": "10.0.0.2", 00:34:28.053 "trsvcid": "4420" 00:34:28.053 } 00:34:28.053 }, 00:34:28.053 { 00:34:28.053 "method": "bdev_set_options", 00:34:28.053 "params": { 00:34:28.053 "bdev_auto_examine": false 00:34:28.053 } 00:34:28.053 } 00:34:28.053 ] 00:34:28.053 } 00:34:28.053 ] 00:34:28.053 }' 00:34:28.053 11:01:03 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.uK2gW6wA1y --ob Nvme0n1 --bs 65536 --count 1 00:34:28.053 11:01:03 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:28.053 "subsystems": [ 00:34:28.053 { 00:34:28.053 "subsystem": "bdev", 00:34:28.053 "config": [ 00:34:28.053 { 00:34:28.053 "method": "bdev_nvme_attach_controller", 00:34:28.053 "params": { 00:34:28.053 "trtype": "tcp", 00:34:28.053 "adrfam": "IPv4", 00:34:28.053 "name": "Nvme0", 00:34:28.053 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:28.053 "traddr": "10.0.0.2", 00:34:28.053 "trsvcid": "4420" 00:34:28.053 } 00:34:28.053 }, 00:34:28.053 { 00:34:28.053 "method": "bdev_set_options", 00:34:28.053 "params": { 00:34:28.053 "bdev_auto_examine": false 00:34:28.053 } 00:34:28.053 } 00:34:28.053 ] 00:34:28.053 } 00:34:28.053 ] 00:34:28.053 }' 00:34:28.053 [2024-07-12 11:01:03.085083] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
00:34:28.053 [2024-07-12 11:01:03.085149] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225679 ] 00:34:28.053 [2024-07-12 11:01:03.213349] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:28.311 [2024-07-12 11:01:03.315617] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:28.570  Copying: 64/64 [kB] (average 12 MBps) 00:34:28.570 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:28.570 11:01:03 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:28.570 11:01:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.571 11:01:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
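The get_stat blocks above expand to an accel_get_stats RPC filtered through jq. The helper itself is defined earlier in chaining.sh and is not reproduced in this trace; judging only from the expanded commands, it behaves roughly like the sketch below (scripts/rpc.py stands in for the suite's rpc_cmd wrapper, which is an assumption):

# Sketch of the behaviour visible in the trace, not the actual chaining.sh code.
get_stat() {
    local event=$1 opcode=$2
    if [[ -z $opcode ]]; then
        # e.g. get_stat sequence_executed  ->  top-level counter
        "$SPDK"/scripts/rpc.py accel_get_stats | jq -r ".$event"
    else
        # e.g. get_stat executed encrypt   ->  per-opcode counter
        "$SPDK"/scripts/rpc.py accel_get_stats |
            jq -r ".operations[] | select(.opcode == \"$opcode\").$event"
    fi
}

# Usage, mirroring the arithmetic checks in the log:
# (( $(get_stat sequence_executed) == stats[sequence_executed] + 1 ))
# (( $(get_stat executed decrypt)  == stats[decrypt_executed] ))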
00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:28.830 11:01:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@96 -- # update_stats 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:28.830 11:01:03 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:28.831 11:01:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.831 11:01:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:28.831 11:01:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:28.831 11:01:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:28.831 11:01:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.831 11:01:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:28.831 11:01:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.831 11:01:04 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:34:28.831 11:01:04 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:28.831 11:01:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
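Between passes the test snapshots the current counters into the stats associative array (declared with 'declare -A stats' earlier in the trace) and asserts deltas after each spdk_dd run, e.g. sequence_executed growing by one per 64 KiB pass while the copy counter stays flat. A rough sketch of that bookkeeping, reusing the hypothetical get_stat helper sketched above (run_one_pass is a placeholder, not a function from the suite):

# Sketch of the update_stats / delta-assertion pattern seen in the trace.
declare -A stats

update_stats() {
    stats[sequence_executed]=$(get_stat sequence_executed)
    stats[encrypt_executed]=$(get_stat executed encrypt)
    stats[decrypt_executed]=$(get_stat executed decrypt)
    stats[copy_executed]=$(get_stat executed copy)
}

update_stats          # snapshot before the I/O
run_one_pass          # hypothetical placeholder for one spdk_dd invocation

# After a single 64 KiB write through the crypto chain the log expects one extra
# executed sequence and an unchanged copy counter:
(( $(get_stat sequence_executed) == stats[sequence_executed] + 1 ))
(( $(get_stat executed copy)     == stats[copy_executed] ))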
00:34:29.090 11:01:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:29.090 11:01:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:29.090 11:01:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:29.090 11:01:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:29.090 11:01:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:29.090 11:01:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.VJA1XoUMTU --ib Nvme0n1 --bs 65536 --count 1 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@25 -- # local config 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:29.090 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:29.090 "subsystems": [ 00:34:29.090 { 00:34:29.090 "subsystem": "bdev", 00:34:29.090 "config": [ 00:34:29.090 { 00:34:29.090 "method": "bdev_nvme_attach_controller", 00:34:29.090 "params": { 00:34:29.090 "trtype": "tcp", 00:34:29.090 "adrfam": "IPv4", 00:34:29.090 "name": "Nvme0", 00:34:29.090 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:29.090 "traddr": "10.0.0.2", 00:34:29.090 "trsvcid": "4420" 00:34:29.090 } 00:34:29.090 }, 00:34:29.090 { 00:34:29.090 "method": "bdev_set_options", 00:34:29.090 "params": { 00:34:29.090 "bdev_auto_examine": false 00:34:29.090 } 00:34:29.090 } 00:34:29.090 ] 00:34:29.090 } 00:34:29.090 ] 00:34:29.090 }' 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.VJA1XoUMTU --ib Nvme0n1 --bs 65536 --count 1 00:34:29.090 11:01:04 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:29.090 "subsystems": [ 00:34:29.090 { 00:34:29.090 "subsystem": "bdev", 00:34:29.090 "config": [ 00:34:29.090 { 00:34:29.090 "method": "bdev_nvme_attach_controller", 00:34:29.090 "params": { 00:34:29.090 "trtype": "tcp", 00:34:29.090 "adrfam": "IPv4", 00:34:29.090 "name": "Nvme0", 00:34:29.090 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:29.090 "traddr": "10.0.0.2", 00:34:29.090 "trsvcid": "4420" 00:34:29.090 } 00:34:29.090 }, 00:34:29.090 { 00:34:29.090 "method": "bdev_set_options", 00:34:29.090 "params": { 
00:34:29.090 "bdev_auto_examine": false 00:34:29.090 } 00:34:29.090 } 00:34:29.090 ] 00:34:29.090 } 00:34:29.090 ] 00:34:29.090 }' 00:34:29.090 [2024-07-12 11:01:04.209507] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:29.090 [2024-07-12 11:01:04.209572] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225889 ] 00:34:29.348 [2024-07-12 11:01:04.340472] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:29.349 [2024-07-12 11:01:04.437854] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:29.865  Copying: 64/64 [kB] (average 20 MBps) 00:34:29.865 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 
00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:29.865 11:01:04 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:29.865 11:01:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:29.865 11:01:05 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:34:29.865 11:01:05 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.uK2gW6wA1y /tmp/tmp.VJA1XoUMTU 00:34:29.865 11:01:05 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:29.865 11:01:05 chaining -- bdev/chaining.sh@25 -- # local config 00:34:29.865 11:01:05 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:29.865 11:01:05 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:29.865 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:30.124 11:01:05 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:30.124 "subsystems": [ 00:34:30.124 { 00:34:30.124 "subsystem": "bdev", 00:34:30.124 "config": [ 00:34:30.124 { 00:34:30.124 "method": "bdev_nvme_attach_controller", 00:34:30.124 "params": { 00:34:30.124 "trtype": "tcp", 00:34:30.124 "adrfam": "IPv4", 00:34:30.124 "name": "Nvme0", 00:34:30.124 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:30.124 "traddr": "10.0.0.2", 00:34:30.124 "trsvcid": "4420" 00:34:30.124 } 00:34:30.124 }, 00:34:30.124 { 00:34:30.124 "method": "bdev_set_options", 00:34:30.124 "params": { 00:34:30.124 "bdev_auto_examine": false 00:34:30.124 } 00:34:30.124 } 00:34:30.124 ] 00:34:30.124 } 00:34:30.124 ] 00:34:30.124 }' 00:34:30.124 11:01:05 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:30.124 11:01:05 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:30.124 "subsystems": [ 00:34:30.124 { 00:34:30.124 "subsystem": "bdev", 00:34:30.124 "config": [ 00:34:30.124 { 00:34:30.124 "method": "bdev_nvme_attach_controller", 00:34:30.124 "params": { 00:34:30.124 "trtype": "tcp", 00:34:30.124 "adrfam": "IPv4", 00:34:30.124 "name": "Nvme0", 00:34:30.124 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:30.124 "traddr": "10.0.0.2", 00:34:30.124 "trsvcid": "4420" 00:34:30.124 } 00:34:30.124 }, 00:34:30.124 { 00:34:30.124 "method": "bdev_set_options", 00:34:30.124 "params": { 00:34:30.124 "bdev_auto_examine": false 00:34:30.124 } 00:34:30.124 } 00:34:30.124 ] 00:34:30.124 
} 00:34:30.124 ] 00:34:30.124 }' 00:34:30.124 [2024-07-12 11:01:05.135068] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:30.124 [2024-07-12 11:01:05.135137] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225934 ] 00:34:30.124 [2024-07-12 11:01:05.263777] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:30.383 [2024-07-12 11:01:05.365518] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:30.641  Copying: 64/64 [kB] (average 15 MBps) 00:34:30.641 00:34:30.641 11:01:05 chaining -- bdev/chaining.sh@106 -- # update_stats 00:34:30.641 11:01:05 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:30.641 11:01:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:30.641 11:01:05 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:30.641 11:01:05 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:30.641 11:01:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:30.641 11:01:05 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:30.642 11:01:05 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:30.642 11:01:05 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:30.642 11:01:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:30.642 11:01:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:30.900 11:01:05 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:30.900 11:01:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:30.900 11:01:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.uK2gW6wA1y --ob Nvme0n1 --bs 4096 --count 16 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@25 -- # local config 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:30.900 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:30.900 "subsystems": [ 00:34:30.900 { 00:34:30.900 "subsystem": "bdev", 00:34:30.900 "config": [ 00:34:30.900 { 00:34:30.900 "method": "bdev_nvme_attach_controller", 00:34:30.900 "params": { 00:34:30.900 "trtype": "tcp", 00:34:30.900 "adrfam": "IPv4", 00:34:30.900 "name": "Nvme0", 00:34:30.900 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:30.900 "traddr": "10.0.0.2", 00:34:30.900 "trsvcid": "4420" 00:34:30.900 } 00:34:30.900 }, 00:34:30.900 { 00:34:30.900 "method": "bdev_set_options", 00:34:30.900 "params": { 00:34:30.900 "bdev_auto_examine": false 00:34:30.900 } 00:34:30.900 } 00:34:30.900 ] 00:34:30.900 } 00:34:30.900 ] 00:34:30.900 }' 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.uK2gW6wA1y --ob Nvme0n1 --bs 4096 --count 16 00:34:30.900 11:01:06 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:30.900 "subsystems": [ 00:34:30.900 { 00:34:30.900 "subsystem": "bdev", 00:34:30.900 "config": [ 00:34:30.900 { 00:34:30.901 "method": "bdev_nvme_attach_controller", 00:34:30.901 "params": { 00:34:30.901 "trtype": "tcp", 00:34:30.901 "adrfam": "IPv4", 00:34:30.901 "name": "Nvme0", 00:34:30.901 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:30.901 "traddr": "10.0.0.2", 00:34:30.901 "trsvcid": "4420" 00:34:30.901 } 00:34:30.901 }, 00:34:30.901 { 00:34:30.901 "method": "bdev_set_options", 00:34:30.901 "params": { 00:34:30.901 "bdev_auto_examine": false 00:34:30.901 } 00:34:30.901 } 00:34:30.901 ] 00:34:30.901 } 00:34:30.901 ] 00:34:30.901 }' 00:34:31.159 [2024-07-12 11:01:06.130605] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 
initialization... 00:34:31.159 [2024-07-12 11:01:06.130677] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2226121 ] 00:34:31.159 [2024-07-12 11:01:06.259992] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:31.417 [2024-07-12 11:01:06.367125] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:31.677  Copying: 64/64 [kB] (average 15 MBps) 00:34:31.677 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:31.677 11:01:06 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.677 11:01:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:31.677 11:01:06 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:31.677 11:01:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:31.677 11:01:06 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.677 11:01:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:31.936 11:01:06 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:31.936 11:01:06 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.936 11:01:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:31.936 11:01:06 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@112 -- # (( 14 == 
stats[decrypt_executed] )) 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:31.936 11:01:06 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.936 11:01:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:31.936 11:01:06 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@114 -- # update_stats 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:31.936 11:01:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:31.936 11:01:07 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.936 11:01:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:31.936 11:01:07 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:31.936 11:01:07 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.936 11:01:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:31.936 11:01:07 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:31.936 11:01:07 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:31.937 11:01:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:31.937 11:01:07 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:31.937 11:01:07 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:31.937 11:01:07 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:31.937 11:01:07 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.937 11:01:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:31.937 11:01:07 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:32.195 11:01:07 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:32.195 11:01:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:32.195 11:01:07 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@117 -- # : 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.VJA1XoUMTU --ib Nvme0n1 --bs 4096 --count 16 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@25 -- # local config 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:32.195 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:32.195 "subsystems": [ 00:34:32.195 { 00:34:32.195 "subsystem": "bdev", 00:34:32.195 "config": [ 00:34:32.195 { 00:34:32.195 "method": "bdev_nvme_attach_controller", 00:34:32.195 "params": { 00:34:32.195 "trtype": "tcp", 00:34:32.195 "adrfam": "IPv4", 00:34:32.195 "name": "Nvme0", 00:34:32.195 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:32.195 "traddr": "10.0.0.2", 00:34:32.195 "trsvcid": "4420" 00:34:32.195 } 00:34:32.195 }, 00:34:32.195 { 00:34:32.195 "method": "bdev_set_options", 00:34:32.195 "params": { 00:34:32.195 "bdev_auto_examine": false 00:34:32.195 } 00:34:32.195 } 00:34:32.195 ] 00:34:32.195 } 00:34:32.195 ] 00:34:32.195 }' 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.VJA1XoUMTU --ib Nvme0n1 --bs 4096 --count 16 00:34:32.195 11:01:07 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:32.195 "subsystems": [ 00:34:32.195 { 00:34:32.195 "subsystem": "bdev", 00:34:32.195 "config": [ 00:34:32.195 { 00:34:32.195 "method": "bdev_nvme_attach_controller", 00:34:32.195 "params": { 00:34:32.195 "trtype": "tcp", 00:34:32.195 "adrfam": "IPv4", 00:34:32.195 "name": "Nvme0", 00:34:32.195 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:32.195 "traddr": "10.0.0.2", 00:34:32.195 "trsvcid": "4420" 
00:34:32.195 } 00:34:32.195 }, 00:34:32.195 { 00:34:32.195 "method": "bdev_set_options", 00:34:32.195 "params": { 00:34:32.195 "bdev_auto_examine": false 00:34:32.195 } 00:34:32.195 } 00:34:32.195 ] 00:34:32.195 } 00:34:32.195 ] 00:34:32.195 }' 00:34:32.195 [2024-07-12 11:01:07.283267] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:32.195 [2024-07-12 11:01:07.283321] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2226328 ] 00:34:32.453 [2024-07-12 11:01:07.398136] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:32.453 [2024-07-12 11:01:07.496990] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:32.971  Copying: 64/64 [kB] (average 1361 kBps) 00:34:32.971 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:32.971 11:01:07 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:32.971 11:01:07 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:32.971 11:01:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:32.971 11:01:07 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "decrypt").executed' 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:32.971 11:01:08 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:32.971 11:01:08 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:33.231 11:01:08 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:34:33.231 11:01:08 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.uK2gW6wA1y /tmp/tmp.VJA1XoUMTU 00:34:33.231 11:01:08 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:34:33.231 11:01:08 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:34:33.231 11:01:08 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.uK2gW6wA1y /tmp/tmp.VJA1XoUMTU 00:34:33.231 11:01:08 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@117 -- # sync 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@120 -- # set +e 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:33.231 rmmod nvme_tcp 00:34:33.231 rmmod nvme_fabrics 00:34:33.231 rmmod nvme_keyring 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@124 -- # set -e 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@125 -- # return 0 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@489 -- # '[' -n 2225459 ']' 00:34:33.231 11:01:08 chaining -- nvmf/common.sh@490 -- # killprocess 2225459 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@948 -- # '[' -z 2225459 ']' 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@952 -- # kill -0 2225459 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@953 -- # uname 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2225459 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2225459' 00:34:33.231 killing process with pid 2225459 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@967 -- 
# kill 2225459 00:34:33.231 11:01:08 chaining -- common/autotest_common.sh@972 -- # wait 2225459 00:34:33.490 11:01:08 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:33.490 11:01:08 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:33.490 11:01:08 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:33.490 11:01:08 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:33.490 11:01:08 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:33.490 11:01:08 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:33.490 11:01:08 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:34:33.490 11:01:08 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:33.490 11:01:08 chaining -- bdev/chaining.sh@132 -- # bperfpid=2226544 00:34:33.490 11:01:08 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2226544 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@829 -- # '[' -z 2226544 ']' 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:33.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:33.490 11:01:08 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:33.490 11:01:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:33.490 [2024-07-12 11:01:08.657084] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
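For reference, the spdk_dd read-back of /tmp/tmp.VJA1XoUMTU above does not use a static config file: gen_nvme.sh describes the remote NVMe-oF namespace as a bdev subsystem, jq appends a bdev_set_options entry that turns off bdev_auto_examine, and the resulting JSON is handed to spdk_dd on an anonymous file descriptor. A minimal sketch of that pattern, with SPDK_ROOT and OUT as placeholder names:

# Sketch only; paths, flags and the jq filter are taken from the trace above, variable names are placeholders.
SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk
OUT=/tmp/read_back.bin

# 1) Describe the remote namespace (tcp 10.0.0.2:4420, cnode0) as a bdev subsystem config.
config=$("$SPDK_ROOT/scripts/gen_nvme.sh" --mode=remote --json-with-subsystems \
    --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0)

# 2) Append bdev_set_options so auto-examine does not claim the bdevs before the test does.
config=$(jq '.subsystems[0].config[.subsystems[0].config | length] |=
    {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' <<< "$config")

# 3) Feed the config over a process-substitution fd and read 16 x 4096-byte blocks from Nvme0n1.
"$SPDK_ROOT/build/bin/spdk_dd" -c <(echo "$config") --ib Nvme0n1 --of "$OUT" --bs 4096 --count 16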
00:34:33.490 [2024-07-12 11:01:08.657149] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2226544 ] 00:34:33.749 [2024-07-12 11:01:08.786859] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:33.749 [2024-07-12 11:01:08.889758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:34.686 11:01:09 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:34.686 11:01:09 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:34.686 11:01:09 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:34:34.686 11:01:09 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:34.686 11:01:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:34.686 malloc0 00:34:34.686 true 00:34:34.686 true 00:34:34.686 [2024-07-12 11:01:09.727217] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:34.686 crypto0 00:34:34.686 [2024-07-12 11:01:09.735242] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:34.686 crypto1 00:34:34.686 11:01:09 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.686 11:01:09 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:34.686 Running I/O for 5 seconds... 00:34:40.009 00:34:40.009 Latency(us) 00:34:40.009 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:40.009 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:40.009 Verification LBA range: start 0x0 length 0x2000 00:34:40.009 crypto1 : 5.01 11437.20 44.68 0.00 0.00 22322.10 6382.64 15956.59 00:34:40.009 =================================================================================================================== 00:34:40.009 Total : 11437.20 44.68 0.00 0.00 22322.10 6382.64 15956.59 00:34:40.009 0 00:34:40.009 11:01:14 chaining -- bdev/chaining.sh@146 -- # killprocess 2226544 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@948 -- # '[' -z 2226544 ']' 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@952 -- # kill -0 2226544 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@953 -- # uname 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2226544 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2226544' 00:34:40.009 killing process with pid 2226544 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@967 -- # kill 2226544 00:34:40.009 Received shutdown signal, test time was about 5.000000 seconds 00:34:40.009 00:34:40.009 Latency(us) 00:34:40.009 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:40.009 =================================================================================================================== 00:34:40.009 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:40.009 11:01:14 chaining -- common/autotest_common.sh@972 -- # wait 2226544 00:34:40.009 11:01:15 chaining -- bdev/chaining.sh@152 -- # 
bperfpid=2227415 00:34:40.009 11:01:15 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2227415 00:34:40.009 11:01:15 chaining -- common/autotest_common.sh@829 -- # '[' -z 2227415 ']' 00:34:40.009 11:01:15 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:40.009 11:01:15 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:40.009 11:01:15 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:40.009 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:40.009 11:01:15 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:40.009 11:01:15 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:40.009 11:01:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:40.267 [2024-07-12 11:01:15.282111] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:40.267 [2024-07-12 11:01:15.282245] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2227415 ] 00:34:40.525 [2024-07-12 11:01:15.478603] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:40.525 [2024-07-12 11:01:15.578999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:41.473 11:01:16 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:41.473 11:01:16 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:41.473 11:01:16 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:34:41.473 11:01:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:41.473 11:01:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:41.473 malloc0 00:34:41.473 true 00:34:41.473 true 00:34:41.473 [2024-07-12 11:01:16.575925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:34:41.473 [2024-07-12 11:01:16.575974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:41.473 [2024-07-12 11:01:16.575995] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c3730 00:34:41.473 [2024-07-12 11:01:16.576008] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:41.473 [2024-07-12 11:01:16.577085] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:41.473 [2024-07-12 11:01:16.577109] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:34:41.473 pt0 00:34:41.473 [2024-07-12 11:01:16.583957] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:41.473 crypto0 00:34:41.473 [2024-07-12 11:01:16.591975] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:41.473 crypto1 00:34:41.473 11:01:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:41.473 11:01:16 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:41.737 Running I/O for 5 seconds... 
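Each 5-second run in this test follows the same drive pattern: bdevperf is launched idle (-z makes it wait for the perform_tests RPC, --wait-for-rpc defers framework init), the harness waits for its RPC socket, builds the malloc/passthru/crypto stack over rpc_cmd, then triggers the workload with bdevperf.py perform_tests and finally tears the process down. A compressed sketch of that lifecycle, using the waitforlisten/killprocess helpers from autotest_common.sh and eliding the stack-building RPCs:

# Sketch of the bperf lifecycle used above; SPDK_ROOT is a placeholder, helpers come from autotest_common.sh.
SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk

"$SPDK_ROOT/build/examples/bdevperf" -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
bperfpid=$!

waitforlisten "$bperfpid"     # block until /var/tmp/spdk.sock answers
# ... rpc_cmd calls create malloc0, pt0 and the crypto0/crypto1 chain here ...

"$SPDK_ROOT/examples/bdev/bdevperf/bdevperf.py" perform_tests   # start the 5 s verify run

killprocess "$bperfpid"       # kill + wait, as traced below for pid 2227415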
00:34:47.002 00:34:47.002 Latency(us) 00:34:47.002 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:47.002 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:47.002 Verification LBA range: start 0x0 length 0x2000 00:34:47.002 crypto1 : 5.01 8943.45 34.94 0.00 0.00 28541.65 865.50 17324.30 00:34:47.002 =================================================================================================================== 00:34:47.002 Total : 8943.45 34.94 0.00 0.00 28541.65 865.50 17324.30 00:34:47.002 0 00:34:47.002 11:01:21 chaining -- bdev/chaining.sh@167 -- # killprocess 2227415 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@948 -- # '[' -z 2227415 ']' 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@952 -- # kill -0 2227415 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@953 -- # uname 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2227415 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2227415' 00:34:47.002 killing process with pid 2227415 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@967 -- # kill 2227415 00:34:47.002 Received shutdown signal, test time was about 5.000000 seconds 00:34:47.002 00:34:47.002 Latency(us) 00:34:47.002 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:47.002 =================================================================================================================== 00:34:47.002 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@972 -- # wait 2227415 00:34:47.002 11:01:21 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:34:47.002 11:01:21 chaining -- bdev/chaining.sh@170 -- # killprocess 2227415 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@948 -- # '[' -z 2227415 ']' 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@952 -- # kill -0 2227415 00:34:47.002 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (2227415) - No such process 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 2227415 is not found' 00:34:47.002 Process with pid 2227415 is not found 00:34:47.002 11:01:21 chaining -- bdev/chaining.sh@171 -- # wait 2227415 00:34:47.002 11:01:21 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:34:47.002 11:01:21 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:34:47.002 11:01:21 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:34:47.002 11:01:21 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:34:47.002 11:01:21 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:34:47.002 11:01:21 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:34:47.002 11:01:21 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:47.002 11:01:21 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:34:47.002 11:01:22 chaining 
-- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:34:47.002 11:01:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@296 -- # e810=() 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@297 -- # x722=() 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@298 -- # mlx=() 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@336 -- # return 1 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:34:47.002 WARNING: No supported devices were found, fallback requested for tcp test 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:34:47.002 11:01:22 chaining -- 
nvmf/common.sh@432 -- # nvmf_veth_init 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:34:47.002 Cannot find device "nvmf_tgt_br" 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@155 -- # true 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:34:47.002 Cannot find device "nvmf_tgt_br2" 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@156 -- # true 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:34:47.002 Cannot find device "nvmf_tgt_br" 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@158 -- # true 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:34:47.002 Cannot find device "nvmf_tgt_br2" 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@159 -- # true 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:34:47.002 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@162 -- # true 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:34:47.002 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@163 -- # true 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:34:47.002 11:01:22 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@178 -- # ip addr 
add 10.0.0.1/24 dev nvmf_init_if 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:34:47.260 11:01:22 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:34:47.261 11:01:22 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:34:47.519 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:47.519 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.081 ms 00:34:47.519 00:34:47.519 --- 10.0.0.2 ping statistics --- 00:34:47.519 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:47.519 rtt min/avg/max/mdev = 0.081/0.081/0.081/0.000 ms 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:34:47.519 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:34:47.519 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.074 ms 00:34:47.519 00:34:47.519 --- 10.0.0.3 ping statistics --- 00:34:47.519 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:47.519 rtt min/avg/max/mdev = 0.074/0.074/0.074/0.000 ms 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:34:47.519 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:34:47.519 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.039 ms 00:34:47.519 00:34:47.519 --- 10.0.0.1 ping statistics --- 00:34:47.519 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:47.519 rtt min/avg/max/mdev = 0.039/0.039/0.039/0.000 ms 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@433 -- # return 0 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:47.519 11:01:22 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@481 -- # nvmfpid=2228557 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@482 -- # waitforlisten 2228557 00:34:47.519 11:01:22 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@829 -- # '[' -z 2228557 ']' 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:47.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:47.519 11:01:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:47.519 [2024-07-12 11:01:22.644523] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:47.519 [2024-07-12 11:01:22.644591] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:47.776 [2024-07-12 11:01:22.771844] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:47.776 [2024-07-12 11:01:22.875443] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:47.776 [2024-07-12 11:01:22.875494] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:34:47.776 [2024-07-12 11:01:22.875509] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:47.776 [2024-07-12 11:01:22.875522] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 
00:34:47.776 [2024-07-12 11:01:22.875533] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:34:47.776 [2024-07-12 11:01:22.875562] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:48.708 11:01:23 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:48.708 11:01:23 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:48.708 11:01:23 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:48.708 malloc0 00:34:48.708 [2024-07-12 11:01:23.635435] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:48.708 [2024-07-12 11:01:23.651641] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:48.708 11:01:23 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:34:48.708 11:01:23 chaining -- bdev/chaining.sh@189 -- # bperfpid=2228754 00:34:48.708 11:01:23 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:48.708 11:01:23 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2228754 /var/tmp/bperf.sock 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@829 -- # '[' -z 2228754 ']' 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:34:48.708 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:48.708 11:01:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:48.708 [2024-07-12 11:01:23.723632] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 
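Since no supported NIC was detected, nvmf_veth_init built the software fallback network whose individual commands are traced above: one veth pair for the initiator, two pairs for the target inside the nvmf_tgt_ns_spdk namespace, all enslaved to a single bridge so 10.0.0.1 can reach the 10.0.0.2:4420 listener. The same wiring, condensed (a restatement of the traced ip/iptables calls, not a new design):

# Fallback TCP test network, as set up by nvmf/common.sh:nvmf_veth_init in the trace above.
ip netns add nvmf_tgt_ns_spdk

ip link add nvmf_init_if type veth peer name nvmf_init_br     # initiator <-> bridge
ip link add nvmf_tgt_if  type veth peer name nvmf_tgt_br      # target   <-> bridge
ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2     # second target port

ip link set nvmf_tgt_if  netns nvmf_tgt_ns_spdk
ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk

ip addr add 10.0.0.1/24 dev nvmf_init_if                      # initiator address
ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if
ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2

ip link set nvmf_init_if up; ip link set nvmf_init_br up
ip link set nvmf_tgt_br up;  ip link set nvmf_tgt_br2 up
ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up
ip netns exec nvmf_tgt_ns_spdk ip link set lo up

ip link add nvmf_br type bridge && ip link set nvmf_br up
ip link set nvmf_init_br master nvmf_br
ip link set nvmf_tgt_br  master nvmf_br
ip link set nvmf_tgt_br2 master nvmf_br

iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT
iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT

ping -c 1 10.0.0.2    # initiator -> target, verified in the trace before starting nvmf_tgt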
00:34:48.708 [2024-07-12 11:01:23.723694] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2228754 ] 00:34:48.708 [2024-07-12 11:01:23.862378] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:48.967 [2024-07-12 11:01:23.966442] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:49.532 11:01:24 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:49.532 11:01:24 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:49.532 11:01:24 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:34:49.532 11:01:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:50.098 [2024-07-12 11:01:25.075561] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:50.098 nvme0n1 00:34:50.098 true 00:34:50.098 crypto0 00:34:50.098 11:01:25 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:50.098 Running I/O for 5 seconds... 00:34:55.362 00:34:55.362 Latency(us) 00:34:55.362 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:55.362 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:55.362 Verification LBA range: start 0x0 length 0x2000 00:34:55.362 crypto0 : 5.02 8237.43 32.18 0.00 0.00 30978.62 4331.07 24048.86 00:34:55.362 =================================================================================================================== 00:34:55.362 Total : 8237.43 32.18 0.00 0.00 30978.62 4331.07 24048.86 00:34:55.362 0 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@205 -- # sequence=82700 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "encrypt").executed' 00:34:55.362 11:01:30 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@206 -- # encrypt=41350 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:55.620 11:01:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:55.621 11:01:30 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@207 -- # decrypt=41350 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:55.878 11:01:31 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:56.136 11:01:31 chaining -- bdev/chaining.sh@208 -- # crc32c=82700 00:34:56.136 11:01:31 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:34:56.136 11:01:31 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:34:56.136 11:01:31 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:34:56.136 11:01:31 chaining -- bdev/chaining.sh@214 -- # killprocess 2228754 00:34:56.136 11:01:31 chaining -- common/autotest_common.sh@948 -- # '[' -z 2228754 ']' 00:34:56.136 11:01:31 chaining -- common/autotest_common.sh@952 -- # kill -0 2228754 00:34:56.136 11:01:31 chaining -- common/autotest_common.sh@953 -- # uname 00:34:56.136 11:01:31 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:56.136 11:01:31 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2228754 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2228754' 00:34:56.393 killing process with pid 2228754 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@967 -- # kill 2228754 00:34:56.393 Received shutdown signal, test time was about 
5.000000 seconds 00:34:56.393 00:34:56.393 Latency(us) 00:34:56.393 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:56.393 =================================================================================================================== 00:34:56.393 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@972 -- # wait 2228754 00:34:56.393 11:01:31 chaining -- bdev/chaining.sh@219 -- # bperfpid=2229750 00:34:56.393 11:01:31 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:34:56.393 11:01:31 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2229750 /var/tmp/bperf.sock 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@829 -- # '[' -z 2229750 ']' 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:56.393 11:01:31 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:34:56.393 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:56.394 11:01:31 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:56.394 11:01:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.652 [2024-07-12 11:01:31.608418] Starting SPDK v24.09-pre git sha1 b3936a144 / DPDK 24.03.0 initialization... 00:34:56.652 [2024-07-12 11:01:31.608506] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2229750 ] 00:34:56.652 [2024-07-12 11:01:31.738097] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:56.652 [2024-07-12 11:01:31.843356] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:57.587 11:01:32 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:57.587 11:01:32 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:57.587 11:01:32 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:34:57.587 11:01:32 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:57.844 [2024-07-12 11:01:32.952385] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:57.845 nvme0n1 00:34:57.845 true 00:34:57.845 crypto0 00:34:57.845 11:01:32 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:58.103 Running I/O for 5 seconds... 
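While this larger-block run (64 KiB I/O, queue depth 32) executes, it is worth spelling out how the previous run's counters were validated: the harness pulls accel statistics over the bdevperf RPC socket and requires that encrypt plus decrypt operations account for every executed sequence and every crc32c operation (41350 + 41350 = 82700 for both, above). A sketch of that check, with the rpc.py invocation shortened into a variable:

# Query accel stats over the bdevperf RPC socket, as get_stat_bperf does in the trace.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock"

sequence=$($RPC accel_get_stats | jq -r '.sequence_executed')
encrypt=$( $RPC accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed')
decrypt=$( $RPC accel_get_stats | jq -r '.operations[] | select(.opcode == "decrypt").executed')
crc32c=$(  $RPC accel_get_stats | jq -r '.operations[] | select(.opcode == "crc32c").executed')

# Same assertions as bdev/chaining.sh@210-212 in the trace above.
(( sequence > 0 ))
(( encrypt + decrypt == sequence ))
(( encrypt + decrypt == crc32c ))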
00:35:03.426 00:35:03.426 Latency(us) 00:35:03.426 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:03.426 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:35:03.426 Verification LBA range: start 0x0 length 0x200 00:35:03.426 crypto0 : 5.01 1681.83 105.11 0.00 0.00 18649.97 1111.26 19147.91 00:35:03.426 =================================================================================================================== 00:35:03.426 Total : 1681.83 105.11 0.00 0.00 18649.97 1111.26 19147.91 00:35:03.426 0 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@233 -- # sequence=16838 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:03.426 11:01:38 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@234 -- # encrypt=8419 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@235 -- # decrypt=8419 
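Once the crc32c counter below has been checked, the per-run teardown again goes through the killprocess helper from autotest_common.sh, as it already did for pids 2226544, 2227415 and 2228754. Roughly paraphrased from the xtrace, not the helper's exact source:

# Rough paraphrase of killprocess as exercised in this log; the sudo-wrapped branch is never taken here.
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1
    if ! kill -0 "$pid"; then
        # e.g. pid 2227415 above: already gone, kill prints "No such process" and we just report it
        echo "Process with pid $pid is not found"
        return 0
    fi
    local process_name=
    if [ "$(uname)" = Linux ]; then
        process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 / reactor_1 in the trace
    fi
    if [ "$process_name" = sudo ]; then
        :   # the real helper escalates for sudo wrappers; not hit in this log
    else
        echo "killing process with pid $pid"
        kill "$pid"
    fi
    wait "$pid"
}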
00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:03.685 11:01:38 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:03.943 11:01:38 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:03.943 11:01:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:03.943 11:01:38 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:03.943 11:01:39 chaining -- bdev/chaining.sh@236 -- # crc32c=16838 00:35:03.943 11:01:39 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:35:03.943 11:01:39 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:35:03.943 11:01:39 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:35:03.943 11:01:39 chaining -- bdev/chaining.sh@242 -- # killprocess 2229750 00:35:03.943 11:01:39 chaining -- common/autotest_common.sh@948 -- # '[' -z 2229750 ']' 00:35:03.943 11:01:39 chaining -- common/autotest_common.sh@952 -- # kill -0 2229750 00:35:03.943 11:01:39 chaining -- common/autotest_common.sh@953 -- # uname 00:35:03.943 11:01:39 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:04.212 11:01:39 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2229750 00:35:04.212 11:01:39 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:04.212 11:01:39 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:04.212 11:01:39 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2229750' 00:35:04.212 killing process with pid 2229750 00:35:04.212 11:01:39 chaining -- common/autotest_common.sh@967 -- # kill 2229750 00:35:04.212 Received shutdown signal, test time was about 5.000000 seconds 00:35:04.212 00:35:04.212 Latency(us) 00:35:04.212 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:04.212 =================================================================================================================== 00:35:04.212 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:04.212 11:01:39 chaining -- common/autotest_common.sh@972 -- # wait 2229750 00:35:04.474 11:01:39 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@117 -- # sync 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@120 -- # set +e 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:04.474 rmmod nvme_tcp 00:35:04.474 rmmod nvme_fabrics 00:35:04.474 rmmod nvme_keyring 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@124 -- # set -e 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@125 -- # return 0 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@489 -- # 
'[' -n 2228557 ']' 00:35:04.474 11:01:39 chaining -- nvmf/common.sh@490 -- # killprocess 2228557 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@948 -- # '[' -z 2228557 ']' 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@952 -- # kill -0 2228557 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@953 -- # uname 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2228557 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2228557' 00:35:04.474 killing process with pid 2228557 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@967 -- # kill 2228557 00:35:04.474 11:01:39 chaining -- common/autotest_common.sh@972 -- # wait 2228557 00:35:04.734 11:01:39 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:04.734 11:01:39 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:04.734 11:01:39 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:04.734 11:01:39 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:04.734 11:01:39 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:04.734 11:01:39 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:04.734 11:01:39 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:04.734 11:01:39 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:04.734 11:01:39 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:04.734 11:01:39 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:35:04.734 00:35:04.734 real 0m46.242s 00:35:04.734 user 1m0.242s 00:35:04.734 sys 0m13.290s 00:35:04.734 11:01:39 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:04.734 11:01:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:04.734 ************************************ 00:35:04.734 END TEST chaining 00:35:04.734 ************************************ 00:35:04.734 11:01:39 -- common/autotest_common.sh@1142 -- # return 0 00:35:04.734 11:01:39 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:35:04.734 11:01:39 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:35:04.734 11:01:39 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:35:04.734 11:01:39 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:35:04.734 11:01:39 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:35:04.734 11:01:39 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:35:04.734 11:01:39 -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:04.734 11:01:39 -- common/autotest_common.sh@10 -- # set +x 00:35:04.734 11:01:39 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:35:04.734 11:01:39 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:35:04.734 11:01:39 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:35:04.734 11:01:39 -- common/autotest_common.sh@10 -- # set +x 00:35:08.932 INFO: APP EXITING 00:35:08.932 INFO: killing all VMs 00:35:08.932 INFO: killing vhost app 00:35:08.932 INFO: EXIT DONE 00:35:12.213 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:35:12.213 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:35:12.213 Waiting for 
block devices as requested 00:35:12.471 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:35:12.471 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:35:12.471 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:35:12.730 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:35:12.730 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:35:12.730 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:35:12.987 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:35:12.987 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:35:12.987 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:35:13.245 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:35:13.245 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:35:13.245 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:35:13.503 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:35:13.503 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:35:13.503 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:35:13.760 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:35:13.760 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:35:17.043 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:35:17.043 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:35:17.310 Cleaning 00:35:17.310 Removing: /var/run/dpdk/spdk0/config 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:35:17.310 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:35:17.310 Removing: /var/run/dpdk/spdk0/hugepage_info 00:35:17.310 Removing: /dev/shm/nvmf_trace.0 00:35:17.310 Removing: /dev/shm/spdk_tgt_trace.pid1974839 00:35:17.310 Removing: /var/run/dpdk/spdk0 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1973987 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1974839 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1975376 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1976239 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1976814 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1977714 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1977800 00:35:17.310 Removing: /var/run/dpdk/spdk_pid1978074 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1980630 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1981980 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1982326 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1982608 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1982854 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1983253 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1983452 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1983653 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1983873 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1984624 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1987327 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1987522 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1987763 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1987979 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1988123 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1988236 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1988432 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1988679 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1988974 00:35:17.311 Removing: 
/var/run/dpdk/spdk_pid1989185 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1989379 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1989579 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1989772 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1990040 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1990327 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1990527 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1990730 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1990925 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1991124 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1991369 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1991672 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1991873 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1992077 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1992273 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1992473 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1992726 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1993026 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1993381 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1993603 00:35:17.311 Removing: /var/run/dpdk/spdk_pid1993964 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1994313 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1994548 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1994911 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1995269 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1995347 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1995760 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1996072 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1996441 00:35:17.569 Removing: /var/run/dpdk/spdk_pid1996634 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2000795 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2002526 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2004692 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2005498 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2006555 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2006869 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2006970 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2007076 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2010865 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2011325 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2012313 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2012514 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2017842 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2019476 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2020446 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2024468 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2025986 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2026955 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2031407 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2033805 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2034775 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2044369 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2046432 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2047397 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2057267 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2059993 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2061012 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2070574 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2073801 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2074818 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2085895 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2088339 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2089498 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2100248 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2102850 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2103999 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2115248 00:35:17.569 Removing: 
/var/run/dpdk/spdk_pid2119066 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2120088 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2121229 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2124448 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2129473 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2131996 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2136506 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2140399 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2145599 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2148488 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2154865 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2157214 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2162988 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2165755 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2172036 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2174286 00:35:17.569 Removing: /var/run/dpdk/spdk_pid2178205 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2178555 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2178936 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2179325 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2179759 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2180475 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2181201 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2181647 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2183246 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2184851 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2186460 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2187916 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2189567 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2191626 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2193221 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2194525 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2195069 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2195598 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2197602 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2199455 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2201307 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2202363 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2203437 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2204000 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2204157 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2204232 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2204580 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2204623 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2205847 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2207358 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2208856 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2209576 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2210453 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2210655 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2210677 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2210864 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2211637 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2212183 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2212670 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2214834 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2217062 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2218861 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2219919 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2221054 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2221623 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2221783 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2225679 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2225889 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2225934 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2226121 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2226328 00:35:17.828 Removing: 
/var/run/dpdk/spdk_pid2226544 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2227415 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2228754 00:35:17.828 Removing: /var/run/dpdk/spdk_pid2229750 00:35:17.828 Clean 00:35:18.086 11:01:53 -- common/autotest_common.sh@1451 -- # return 0 00:35:18.086 11:01:53 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:35:18.086 11:01:53 -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:18.086 11:01:53 -- common/autotest_common.sh@10 -- # set +x 00:35:18.087 11:01:53 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:35:18.087 11:01:53 -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:18.087 11:01:53 -- common/autotest_common.sh@10 -- # set +x 00:35:18.087 11:01:53 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:35:18.087 11:01:53 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:35:18.087 11:01:53 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:35:18.087 11:01:53 -- spdk/autotest.sh@391 -- # hash lcov 00:35:18.087 11:01:53 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:35:18.087 11:01:53 -- spdk/autotest.sh@393 -- # hostname 00:35:18.087 11:01:53 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:35:18.344 geninfo: WARNING: invalid characters removed from testname! 00:35:50.450 11:02:20 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:50.450 11:02:23 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:51.384 11:02:26 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:54.667 11:02:29 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:57.198 11:02:31 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc 
lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:59.729 11:02:34 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:02.264 11:02:37 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:36:02.264 11:02:37 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:36:02.264 11:02:37 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:36:02.264 11:02:37 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:36:02.264 11:02:37 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:36:02.264 11:02:37 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:02.264 11:02:37 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:02.264 11:02:37 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:02.264 11:02:37 -- paths/export.sh@5 -- $ export PATH 00:36:02.264 11:02:37 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:02.264 11:02:37 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:02.264 11:02:37 -- common/autobuild_common.sh@444 -- $ date +%s 00:36:02.264 11:02:37 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720774957.XXXXXX 00:36:02.264 11:02:37 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720774957.jG42st 00:36:02.264 11:02:37 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:36:02.264 11:02:37 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:36:02.264 11:02:37 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:36:02.264 11:02:37 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:36:02.264 11:02:37 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:36:02.264 11:02:37 -- common/autobuild_common.sh@460 -- $ get_config_params 00:36:02.264 11:02:37 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:36:02.264 11:02:37 -- common/autotest_common.sh@10 -- $ set +x 00:36:02.264 11:02:37 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:36:02.264 11:02:37 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:36:02.264 11:02:37 -- pm/common@17 -- $ local monitor 00:36:02.264 11:02:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:02.264 11:02:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:02.264 11:02:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:02.264 11:02:37 -- pm/common@21 -- $ date +%s 00:36:02.264 11:02:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:02.264 11:02:37 -- pm/common@21 -- $ date +%s 00:36:02.264 11:02:37 -- pm/common@25 -- $ sleep 1 00:36:02.264 11:02:37 -- pm/common@21 -- $ date +%s 00:36:02.264 11:02:37 -- pm/common@21 -- $ date +%s 00:36:02.264 11:02:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720774957 00:36:02.264 11:02:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720774957 00:36:02.264 11:02:37 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720774957 00:36:02.264 11:02:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720774957 00:36:02.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720774957_collect-vmstat.pm.log 00:36:02.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720774957_collect-cpu-temp.pm.log 00:36:02.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720774957_collect-cpu-load.pm.log 00:36:02.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720774957_collect-bmc-pm.bmc.pm.log 00:36:03.199 11:02:38 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:36:03.199 11:02:38 -- 
spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:36:03.199 11:02:38 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:03.199 11:02:38 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:36:03.199 11:02:38 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:36:03.199 11:02:38 -- spdk/autopackage.sh@19 -- $ timing_finish 00:36:03.199 11:02:38 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:36:03.199 11:02:38 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:36:03.199 11:02:38 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:03.199 11:02:38 -- spdk/autopackage.sh@20 -- $ exit 0 00:36:03.199 11:02:38 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:36:03.199 11:02:38 -- pm/common@29 -- $ signal_monitor_resources TERM 00:36:03.199 11:02:38 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:36:03.199 11:02:38 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:03.199 11:02:38 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:36:03.199 11:02:38 -- pm/common@44 -- $ pid=2240213 00:36:03.199 11:02:38 -- pm/common@50 -- $ kill -TERM 2240213 00:36:03.199 11:02:38 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:03.199 11:02:38 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:36:03.199 11:02:38 -- pm/common@44 -- $ pid=2240215 00:36:03.199 11:02:38 -- pm/common@50 -- $ kill -TERM 2240215 00:36:03.199 11:02:38 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:03.199 11:02:38 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:36:03.199 11:02:38 -- pm/common@44 -- $ pid=2240217 00:36:03.199 11:02:38 -- pm/common@50 -- $ kill -TERM 2240217 00:36:03.199 11:02:38 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:03.200 11:02:38 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:36:03.200 11:02:38 -- pm/common@44 -- $ pid=2240234 00:36:03.200 11:02:38 -- pm/common@50 -- $ sudo -E kill -TERM 2240234 00:36:03.200 + [[ -n 1859321 ]] 00:36:03.200 + sudo kill 1859321 00:36:03.209 [Pipeline] } 00:36:03.228 [Pipeline] // stage 00:36:03.234 [Pipeline] } 00:36:03.253 [Pipeline] // timeout 00:36:03.259 [Pipeline] } 00:36:03.279 [Pipeline] // catchError 00:36:03.283 [Pipeline] } 00:36:03.299 [Pipeline] // wrap 00:36:03.305 [Pipeline] } 00:36:03.315 [Pipeline] // catchError 00:36:03.322 [Pipeline] stage 00:36:03.324 [Pipeline] { (Epilogue) 00:36:03.334 [Pipeline] catchError 00:36:03.335 [Pipeline] { 00:36:03.346 [Pipeline] echo 00:36:03.347 Cleanup processes 00:36:03.351 [Pipeline] sh 00:36:03.628 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:03.628 2240331 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:36:03.628 2240533 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:03.640 [Pipeline] sh 00:36:03.950 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:03.951 ++ grep -v 'sudo pgrep' 00:36:03.951 ++ awk '{print $1}' 00:36:03.951 + sudo kill -9 2240331 00:36:03.963 [Pipeline] sh 00:36:04.247 + 
jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:36:16.471 [Pipeline] sh 00:36:16.757 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:36:16.757 Artifacts sizes are good 00:36:16.771 [Pipeline] archiveArtifacts 00:36:16.778 Archiving artifacts 00:36:17.088 [Pipeline] sh 00:36:17.373 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:36:17.392 [Pipeline] cleanWs 00:36:17.404 [WS-CLEANUP] Deleting project workspace... 00:36:17.404 [WS-CLEANUP] Deferred wipeout is used... 00:36:17.411 [WS-CLEANUP] done 00:36:17.414 [Pipeline] } 00:36:17.434 [Pipeline] // catchError 00:36:17.446 [Pipeline] sh 00:36:17.727 + logger -p user.info -t JENKINS-CI 00:36:17.736 [Pipeline] } 00:36:17.752 [Pipeline] // stage 00:36:17.758 [Pipeline] } 00:36:17.774 [Pipeline] // node 00:36:17.780 [Pipeline] End of Pipeline 00:36:17.811 Finished: SUCCESS